Compare commits


87 Commits

Author SHA1 Message Date
Travis
374239777e Automatic translation update 2017-07-19 07:55:50 +00:00
Safihre
9a7701d7e6 Update text files for 2.2.0Beta1 2017-07-19 09:33:00 +02:00
Safihre
01ff04f338 Allow Aborting of Direct Unpack during PP and add Completed label 2017-07-19 09:27:24 +02:00
Safihre
eac39767dd Renames on Retry only when defined
Otherwise if it's None, later this will happen:
original_filename = self.renames.get(nzf.filename, '')
AttributeError: 'NoneType' object has no attribute 'get'
2017-07-19 09:23:58 +02:00
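The traceback quoted above is the classic "optional dict left as None" pattern; the fix is to never store None where a mapping is expected. A minimal illustrative sketch (the `renames` attribute and lookup follow the commit message; the class itself is hypothetical, not SABnzbd's actual code):

```python
# Hypothetical sketch of the bug and the guard described in the commit above.
class NzbObject:
    def __init__(self, renames=None):
        # Store an empty dict instead of None, so .get() is always safe
        self.renames = renames if renames is not None else {}

    def original_filename(self, filename):
        # Previously crashed with AttributeError when self.renames was None
        return self.renames.get(filename, '')

job = NzbObject()  # Retry with no renames defined
assert job.original_filename('abc.rar') == ''
```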
Safihre
0d0adf99fa Proper counting of bad articles for DirectUnpack & Prospective Par2 2017-07-18 22:07:56 +02:00
Safihre
16905ce34f Show filename for Unzip instead of Path and show start of Verification 2017-07-18 21:16:05 +02:00
Safihre
5287fa8a0c Stability improvements for Direct Unpack
Now shows the time spent unpacking; many other bugs squashed.
2017-07-18 21:15:30 +02:00
Safihre
b72ab4fb8e Allow concurrent unpacking 2017-07-18 15:14:28 +02:00
Safihre
81054c675c Minimum speed for Direct Unpack lowered to 40MB/s
It is tested during downloading, so if 40MB/s is still possible then we should be good to go.
2017-07-18 13:51:13 +02:00
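A speed threshold like this can be approximated by timing a sequential write to the incomplete folder. Below is a minimal sketch, not SABnzbd's actual implementation; the function name and the 10 MB sample size are assumptions:

```python
import os
import tempfile
import time

def diskspeed(path, size_mb=10):
    """Roughly measure sequential write speed (MB/s) at `path`."""
    chunk = b'\x00' * (1024 * 1024)
    testfile = os.path.join(path, '.diskspeed.tmp')
    start = time.time()
    with open(testfile, 'wb') as f:
        for _ in range(size_mb):
            f.write(chunk)
        f.flush()
        os.fsync(f.fileno())  # include the real cost of hitting the disk
    elapsed = max(time.time() - start, 1e-6)
    os.remove(testfile)
    return size_mb / elapsed

speed = diskspeed(tempfile.gettempdir())
direct_unpack_allowed = speed > 40  # the 40 MB/s threshold from this commit
```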
Safihre
7362be8748 Group cfg settings by Config section
It was a big mess.
They could still be sorted within each section; next time.
2017-07-17 20:42:54 +02:00
Travis
b4ba2b3463 Automatic translation update 2017-07-17 18:33:42 +00:00
Safihre
8bed6938c1 Change text in DirectUnpack Enabled message
See also #966
2017-07-17 20:11:50 +02:00
Safihre
ecf16f6201 Show DirectUnpack progress the same as Unpack progress: xx/xx 2017-07-17 17:07:44 +02:00
Safihre
bf240357df Regressions in preparation of extraction path
Thanks @Cpuroast
2017-07-17 16:45:58 +02:00
Safihre
ddcf447957 Add missing save_config after modifying settings
Closes #966
2017-07-17 10:10:20 +02:00
Safihre
d9642611e2 Correct error in missing notify options
#966
2017-07-16 20:28:15 +02:00
Safihre
0018c6f263 Move regex to top and increase save-timeout 2017-07-16 19:26:35 +02:00
Safihre
6398bfa12f Use speed from download-log instead of re-calculating
Closes #829
2017-07-16 19:22:52 +02:00
Safihre
01dfb7538d Correct FileList Move to Top/Bottom CSS for Firefox 2017-07-16 14:28:04 +02:00
Safihre
3f0d4675b6 Fix CSS for Direct Unpack and Move to Top/Bottom 2017-07-16 14:18:50 +02:00
Safihre
f23c5caf80 Fix typo in DirectRenamer for non-Windows 2017-07-16 13:55:36 +02:00
Safihre
bd22430b26 Update text files for 2.2.0Alpha3 2017-07-16 11:04:17 +02:00
Safihre
1189a7fdbc Use tuple in endswith for Direct Unpack
Thanks @hellowlol
2017-07-16 10:59:00 +02:00
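Passing a tuple to `str.endswith` is the point of this commit: one call covers several suffixes instead of a chain of `or` comparisons. The suffix list below is illustrative, not SABnzbd's actual one:

```python
# str.endswith accepts a tuple of suffixes, so one call replaces a chain
# of `or` comparisons. The suffix list here is illustrative.
RAR_SUFFIXES = ('.rar', '.r00', '.r01')

def looks_like_rar(filename):
    return filename.lower().endswith(RAR_SUFFIXES)

assert looks_like_rar('movie.part01.RAR')
assert not looks_like_rar('movie.par2')
```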
Safihre
f3aa4f84fc Remove waiting-time between URLGrab's
Other newsreaders grab multiple URLs at once, so no need for us to wait.
2017-07-16 10:40:39 +02:00
Safihre
ea26ce4700 Remove non-separator RSS-url commas by detecting if they are valid URLs
Closes #965
2017-07-16 10:30:09 +02:00
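One way to implement the comma detection this commit describes is to split on commas and then re-join fragments that do not themselves parse as URLs. A hedged sketch, not the project's actual code:

```python
from urllib.parse import urlparse

def split_feed_urls(raw):
    """Split a comma-separated URL list, re-joining commas inside a URL."""
    urls = []
    for part in raw.split(','):
        part = part.strip()
        if urlparse(part).scheme in ('http', 'https'):
            urls.append(part)        # a real URL starts a new entry
        elif urls:
            urls[-1] += ',' + part   # glue a non-URL fragment back on
    return urls

assert split_feed_urls('http://a/rss?cat=1,2,http://b/rss') == \
    ['http://a/rss?cat=1,2', 'http://b/rss']
```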
Safihre
a1e649b7e2 Correct error in PAR_Verify with renames 2017-07-15 23:43:32 +02:00
Safihre
3b9f2b2cf0 Remove par2classic/cmdline for Windows and macOS 2017-07-15 23:33:20 +02:00
Safihre
7333d19e1c Notifications selection based on Categories
Closes #716
2017-07-15 22:22:20 +02:00
Safihre
232d537d23 Correct Direct Unpack locking behavior for multisets 2017-07-15 17:02:20 +02:00
Safihre
c6e17e7bcb Duplicate par2-16k values need force-remove 2017-07-15 17:02:20 +02:00
Safihre
54c6fd55dd Detection of forbidden-Windows names altered
We now sanitize the name during Assembler, and when making decisions for Unrar/Par2 we need to know whether they might create something unsafe.
2017-07-15 17:02:20 +02:00
Safihre
0625aa1ca8 Make sure all Par2-16k signatures are unique, also in multisets 2017-07-15 17:02:20 +02:00
Safihre
83643f3298 Remove allow_streaming
A bit redundant now that we have DirectUnpack
2017-07-15 17:02:20 +02:00
Safihre
ff3c46fe1f Remove enable_meta 2017-07-15 17:02:20 +02:00
Safihre
0930f0dcee Test disk-speed first time DirectUnpack is called 2017-07-15 17:02:20 +02:00
Safihre
3221257310 UnRar's ERROR is also an error
And add starting file to log.
2017-07-15 17:02:20 +02:00
Safihre
8048a73156 Handle active DirectUnpacker in postproc better 2017-07-15 17:02:20 +02:00
Safihre
ea552cd402 Cancel DirectUnpack when the final name changes 2017-07-15 17:02:20 +02:00
Safihre
dcb925f621 Case insensitive matching for DirectUnpack sets 2017-07-15 17:02:20 +02:00
Safihre
cce91e1985 DirectUnpacker should stay to listen to new sets 2017-07-15 17:02:20 +02:00
Safihre
e17d417c2e Re-introduce locks for TryList
After studying everything, it really needs it. Closes #738
2017-07-15 17:02:20 +02:00
Safihre
a69f5bd2df Prevent DirectUnpack locking the PostProcessing 2017-07-15 17:02:20 +02:00
Safihre
97e53eb4d3 Better DirectUnpack percentage counter 2017-07-15 17:02:20 +02:00
Safihre
a6da2b7bee Prevent possible crash in par2_repair 2017-07-15 17:02:20 +02:00
Safihre
4a21e7c217 Show percentage of DirectUnpack, when available 2017-07-15 17:02:20 +02:00
Safihre
9bd3c7be44 Increase maximum number of unpackers
Unrar takes almost no memory anyway
2017-07-15 17:02:20 +02:00
Safihre
434f5c4b2d Remove Audio/Video quality rating icons from Queue 2017-07-15 17:02:20 +02:00
Safihre
d3cc4f9f07 Direct Unpack indicator for Queue 2017-07-15 17:02:20 +02:00
Safihre
a16aa17c17 Don't start when not set to +Unpack and abort if Category changed 2017-07-15 17:02:20 +02:00
Safihre
68445d0409 Full working implementation of DirectUnpack with multi-sets 2017-07-15 17:02:20 +02:00
Safihre
32b68a45cc Integrate with PostProc 2017-07-15 17:02:20 +02:00
Safihre
345f8359cc Unpack to the right directory (with Sorter support) 2017-07-15 17:02:20 +02:00
Safihre
81f9886584 Add Direct Unpack to Config 2017-07-15 17:02:20 +02:00
Safihre
adbc618808 Improvements to detection of volumes 2017-07-15 17:02:20 +02:00
Safihre
41eafc6b4b Become set-specific 2017-07-15 17:02:20 +02:00
Safihre
9f18d8e8c1 Basic working Direct Unpack
Lots to do
2017-07-15 17:02:20 +02:00
Safihre
8c2c853166 Make sure to always have lowest part number 2017-07-15 17:02:20 +02:00
Safihre
97914906a0 Also handle GNTP errors during sending 2017-07-14 14:43:39 +02:00
Safihre
f1ce4ed19b Correctly handle new GNTP errors 2017-07-14 14:41:03 +02:00
Safihre
99185d8151 Update GNTP to 1.0.3
Closes #334
2017-07-14 14:25:07 +02:00
Safihre
385b6b7ade Remove QCHECK_FILE again 2017-07-14 14:25:07 +02:00
gwyden
81ea513f8c Added buttons and logic to move to top and bottom of download queue (#962)
* added buttons and logic to move to top and bottom of queue
* allowed for a larger control box for the new buttons
* Cleanup of unnecessary code
* Simple top and bottom of queue using existing queue data
2017-07-13 23:52:43 +02:00
Safihre
336b1ddba3 Always remove forbidden Win-devices from filenames
This breaks support for par2cmdline on Windows with forbidden names. Assuming no users have disabled both Multipar *and* par2_multicore
2017-07-12 18:38:19 +02:00
Safihre
7274973322 Shorten par_cleanup code 2017-07-12 18:38:19 +02:00
Safihre
af132965de Revert "Remove QCHECK_FILE, not needed"
This reverts commit 4f8cc3f697.
2017-07-12 18:38:19 +02:00
Safihre
5586742886 Use RarFile.volumelist to get list of used rar-volumes 2017-07-12 18:38:19 +02:00
Safihre
5868b51490 Use fix to allow unicode arguments to POpen on Windows 2017-07-12 18:38:19 +02:00
Travis
7f17a38b9b Automatic translation update 2017-07-12 15:10:14 +00:00
Safihre
415e843ebb Remove 'WARNING:' label from Assembler warnings
It was inconsistent with other messages
2017-07-11 13:33:50 +02:00
Safihre
7ffc1192bb Only par2-rename when actually different 2017-07-11 12:00:36 +02:00
Safihre
945e769a03 Also perform prospective-par2 on renamed files 2017-07-10 23:06:05 +02:00
Safihre
86c7fb86cc Ignore first-16k par2 info if it's not unique 2017-07-10 22:51:17 +02:00
Safihre
ff20f3f620 Fix possible unicode error in tvsort and typo in newsunpack
Closes #950
2017-07-10 21:56:07 +02:00
Safihre
e8bef94706 Correctly handle renames on (multiple) retries 2017-07-10 21:03:37 +02:00
Safihre
d05fe2d680 More uniform handling of renames 2017-07-10 20:53:31 +02:00
Safihre
4f8cc3f697 Remove QCHECK_FILE, not needed 2017-07-10 19:54:59 +02:00
Safihre
6fa619fa37 More robust renaming based on par2 first-16k info
Also when the correct name is
2017-07-10 17:40:39 +02:00
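For context on the first-16k technique these commits rely on: par2 files record the MD5 of the first 16 KB of each source file, which can identify obfuscated downloads while they are still in progress. A hypothetical illustration (names and data are made up):

```python
import hashlib

PAR2_HEAD = 16 * 1024  # par2 stores the MD5 of each file's first 16 KB

def md5_of_16k(data: bytes) -> str:
    return hashlib.md5(data[:PAR2_HEAD]).hexdigest()

# Hash-to-real-name table, as it could be read from a par2 set (made up)
known = {md5_of_16k(b'RealContent' * 100): 'episode.mkv'}

def deobfuscate(filename, first_bytes):
    # Fall back to the downloaded name when the hash is unknown
    return known.get(md5_of_16k(first_bytes), filename)

assert deobfuscate('abc123.bin', b'RealContent' * 100) == 'episode.mkv'
```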
Safihre
a43f5369ea Do not rename .par2 filenames from NZB
They are usually correct, if mentioned at all
2017-07-10 17:29:34 +02:00
Safihre
2040173dc2 Rename parts of Assembler to be more coherent 2017-07-10 17:20:03 +02:00
Safihre
a15b7ec7ac Remove Windows utf8 detection using par2
Obsolete now that we have MultiPar
2017-07-10 17:17:12 +02:00
Safihre
6adcf2ce10 Stylistic changes from previous commits 2017-07-10 17:11:32 +02:00
Safihre
e756b9b5c1 Correct filenames while downloading using first-16kb par2 info
Maybe we can also do DirectUnpack!
2017-07-10 17:07:16 +02:00
Safihre
b3de745849 Do not use article-filename if it looks obfuscated 2017-07-10 15:54:17 +02:00
Safihre
77f3dc18b5 Corrections of Move To Top for filelists 2017-07-09 19:51:18 +03:00
gwyden
6b2f15f82e Move To Top/Move To Bottom buttons for filelists (#959)
* Control creation

* JQuery to make the buttons work

* minor text fixes

* tab to spaces cleanup

* style additions and removed hard text from code

* Moved button control to modal finish render event, gave file details a little more room

* Moved control to replace age and size on mouseover

* Added margins and color corrected for the night theme

* resolved night theme readability

* move to working top and bottom

* Controls would lose event bindings after the append. Detach first, then insert

* Move to Top and Bottom buttons for files in each NZB
2017-07-09 18:34:33 +02:00
Safihre
570e58611d Repair would fail if extrapars were deleted by previous run
Closes #961
2017-07-06 18:30:31 +03:00
Safihre
6b69010aec Add logging for missing NZF database to debug #952 2017-06-28 11:35:52 +02:00
65 changed files with 3098 additions and 3347 deletions

View File

@@ -1,7 +1,7 @@
Metadata-Version: 1.0
Name: SABnzbd
Version: 2.2.0Alpha2
Summary: SABnzbd-2.2.0Alpha2
Version: 2.2.0Beta1
Summary: SABnzbd-2.2.0Beta1
Home-page: https://sabnzbd.org
Author: The SABnzbd Team
Author-email: team@sabnzbd.org

View File

@@ -1,4 +1,4 @@
Release Notes - SABnzbd 2.2.0 Alpha 2
Release Notes - SABnzbd 2.2.0 Beta 1
=========================================================
NOTE: Due to changes in this release, the queue will be converted when 2.2.0
@@ -6,41 +6,51 @@ is started for the first time. Job order, settings and data will be
preserved, but all jobs will be unpaused and URLs that did not finish
fetching before the upgrade will be lost!
We now also accept donations via Bitcoin, Ethereum and Litecoin:
https://sabnzbd.org/donate/
## Bugfixes since Alpha 3
- Bugfixes and stability updates for Direct Unpack
- Notification errors
- Correct value in "Speed" Extra History Column
## Changes since Alpha 1
## Changes since 2.1.0
- Direct Unpack: Jobs will start unpacking during the download, which reduces
post-processing time but requires a capable hard drive. Only works for jobs that
do not need repair. Will be enabled if your incomplete folder-speed > 60MB/s
- Reduced memory usage, especially with larger queues
- Removed 5 second delay between fetching URLs
- Notifications can now be limited to certain Categories
- Each item in the Queue and Filelist now has Move to Top/Bottom buttons
- Smoother animations in Firefox (disabled previously due to FF high-CPU usage)
- Jobs outside server retention are processed faster
- Show missing articles in MB instead of number of articles
- 'Download all par2' will download all par2 if par_cleanup is disabled
- Windows: Move enable_multipar to Specials (so MultiPar is always used)
- Obfuscated filenames are renamed during downloading, if possible
- If enable_par_cleanup is disabled all par2 files will be downloaded
- If enabled, replace dots in filenames also when there are spaces already
- Update GNTP bindings to 1.0.3
- max_art_opt and replace_illegal moved from Switches to Specials
- Removed Specials enable_meta, par2_multicore and allow_streaming
- Windows: Full unicode support when calling repair and unpack
- Windows: Move enable_multipar to Specials
- Windows: Better indication of verification process before and after repair
- Windows: MultiPar verification of a job is skipped after blocks are fetched
- Windows & macOS: removed par2cmdline in favor of par2tbb/Multipar
## Bugfixes since Alpha 1
## Bugfixes since 2.1.0
- Shutdown/suspend did not work on some Linux systems
- Deleting a job could result in write errors
- Display warning if custom par2 parameters are wrong
- RSS URLs with commas were broken
- Fixed some "Saving failed" errors
- Fixed crashing URLGrabber
- Jobs with renamed files are now correctly handled when using Retry
- Disk-space readings could be updated incorrectly
- Correct redirect after enabling HTTPS in the Config
- Fix race-condition in Post-processing
- History would not always show latest changes
- Convert HTML in error messages
- Fixed unicode error during Sorting
- Not all texts were shown in the selected Language
## Changes in 2.2.0
- Reduced memory usage, especially with larger queues
- Slight improvement in download performance by removing internal locks
- Smoother animations in Firefox (disabled previously due to FF high-CPU usage)
- If enabled, replace dots in filenames also when there are spaces already
- Jobs outside server retention are processed faster
- max_art_opt and replace_illegal moved from Switches to Specials
## Bugfixes in 2.2.0
- Shutdown/suspend did not work on some Linux systems
- Deleting a job could result in write errors
- Display warning if custom par2 parameters are wrong
- macOS: Catch 'Protocol wrong type for socket' errors
- Windows: Fix error in MultiPar-code when first par2-file was damaged
## Translations
- Added Hebrew translation by ION IL, many other languages updated.

View File

@@ -428,9 +428,6 @@ def print_modules():
else:
logging.error(T('par2 binary... NOT found!'))
if sabnzbd.newsunpack.PAR2C_COMMAND:
logging.info("par2cmdline binary... found (%s)", sabnzbd.newsunpack.PAR2C_COMMAND)
if sabnzbd.newsunpack.MULTIPAR_COMMAND:
logging.info("MultiPar binary... found (%s)", sabnzbd.newsunpack.MULTIPAR_COMMAND)

View File

@@ -1,509 +0,0 @@
import re
import hashlib
import time
import StringIO
__version__ = '0.8'
#GNTP/<version> <messagetype> <encryptionAlgorithmID>[:<ivValue>][ <keyHashAlgorithmID>:<keyHash>.<salt>]
GNTP_INFO_LINE = re.compile(
'GNTP/(?P<version>\d+\.\d+) (?P<messagetype>REGISTER|NOTIFY|SUBSCRIBE|\-OK|\-ERROR)' +
' (?P<encryptionAlgorithmID>[A-Z0-9]+(:(?P<ivValue>[A-F0-9]+))?) ?' +
'((?P<keyHashAlgorithmID>[A-Z0-9]+):(?P<keyHash>[A-F0-9]+).(?P<salt>[A-F0-9]+))?\r\n',
re.IGNORECASE
)
GNTP_INFO_LINE_SHORT = re.compile(
'GNTP/(?P<version>\d+\.\d+) (?P<messagetype>REGISTER|NOTIFY|SUBSCRIBE|\-OK|\-ERROR)',
re.IGNORECASE
)
GNTP_HEADER = re.compile('([\w-]+):(.+)')
GNTP_EOL = '\r\n'
class BaseError(Exception):
def gntp_error(self):
error = GNTPError(self.errorcode, self.errordesc)
return error.encode()
class ParseError(BaseError):
errorcode = 500
errordesc = 'Error parsing the message'
class AuthError(BaseError):
errorcode = 400
errordesc = 'Error with authorization'
class UnsupportedError(BaseError):
errorcode = 500
errordesc = 'Currently unsupported by gntp.py'
class _GNTPBuffer(StringIO.StringIO):
"""GNTP Buffer class"""
def writefmt(self, message="", *args):
"""Shortcut function for writing GNTP Headers"""
self.write((message % args).encode('utf8', 'replace'))
self.write(GNTP_EOL)
class _GNTPBase(object):
"""Base initilization
:param string messagetype: GNTP Message type
:param string version: GNTP Protocol version
:param string encription: Encryption protocol
"""
def __init__(self, messagetype=None, version='1.0', encryption=None):
self.info = {
'version': version,
'messagetype': messagetype,
'encryptionAlgorithmID': encryption
}
self.headers = {}
self.resources = {}
def __str__(self):
return self.encode()
def _parse_info(self, data):
"""Parse the first line of a GNTP message to get security and other info values
:param string data: GNTP Message
:return dict: Parsed GNTP Info line
"""
match = GNTP_INFO_LINE.match(data)
if not match:
raise ParseError('ERROR_PARSING_INFO_LINE')
info = match.groupdict()
if info['encryptionAlgorithmID'] == 'NONE':
info['encryptionAlgorithmID'] = None
return info
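The info line parsed above follows the `GNTP/<version> <messagetype> <encryption…>` grammar from the comment near the top of the file. A simplified, self-contained version of that regex for illustration (it omits the optional IV and key-hash groups of the full pattern):

```python
import re

# Simplified GNTP info line: version, message type, and encryption ID only
GNTP_INFO = re.compile(
    r'GNTP/(?P<version>\d+\.\d+) '
    r'(?P<messagetype>REGISTER|NOTIFY|SUBSCRIBE|-OK|-ERROR) '
    r'(?P<encryptionAlgorithmID>[A-Z0-9]+)',
    re.IGNORECASE,
)

m = GNTP_INFO.match('GNTP/1.0 NOTIFY NONE\r\n')
assert m.group('version') == '1.0'
assert m.group('messagetype') == 'NOTIFY'
assert m.group('encryptionAlgorithmID') == 'NONE'
```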
def set_password(self, password, encryptAlgo='MD5'):
"""Set a password for a GNTP Message
:param string password: Null to clear password
:param string encryptAlgo: Supports MD5, SHA1, SHA256, SHA512
"""
hash = {
'MD5': hashlib.md5,
'SHA1': hashlib.sha1,
'SHA256': hashlib.sha256,
'SHA512': hashlib.sha512,
}
self.password = password
self.encryptAlgo = encryptAlgo.upper()
if not password:
self.info['encryptionAlgorithmID'] = None
self.info['keyHashAlgorithm'] = None
return
if not self.encryptAlgo in hash.keys():
raise UnsupportedError('INVALID HASH "%s"' % self.encryptAlgo)
hashfunction = hash.get(self.encryptAlgo)
password = password.encode('utf8')
seed = time.ctime()
salt = hashfunction(seed).hexdigest()
saltHash = hashfunction(seed).digest()
keyBasis = password + saltHash
key = hashfunction(keyBasis).digest()
keyHash = hashfunction(key).hexdigest()
self.info['keyHashAlgorithmID'] = self.encryptAlgo
self.info['keyHash'] = keyHash.upper()
self.info['salt'] = salt.upper()
def _decode_hex(self, value):
"""Helper function to decode hex string to `proper` hex string
:param string value: Human readable hex string
:return string: Hex string
"""
result = ''
for i in range(0, len(value), 2):
tmp = int(value[i:i + 2], 16)
result += chr(tmp)
return result
def _decode_binary(self, rawIdentifier, identifier):
rawIdentifier += '\r\n\r\n'
dataLength = int(identifier['Length'])
pointerStart = self.raw.find(rawIdentifier) + len(rawIdentifier)
pointerEnd = pointerStart + dataLength
data = self.raw[pointerStart:pointerEnd]
if not len(data) == dataLength:
raise ParseError('INVALID_DATA_LENGTH Expected: %s Received %s' % (dataLength, len(data)))
return data
def _validate_password(self, password):
"""Validate GNTP Message against stored password"""
self.password = password
if password == None:
raise AuthError('Missing password')
keyHash = self.info.get('keyHash', None)
if keyHash is None and self.password is None:
return True
if keyHash is None:
raise AuthError('Invalid keyHash')
if self.password is None:
raise AuthError('Missing password')
password = self.password.encode('utf8')
saltHash = self._decode_hex(self.info['salt'])
keyBasis = password + saltHash
key = hashlib.md5(keyBasis).digest()
keyHash = hashlib.md5(key).hexdigest()
if not keyHash.upper() == self.info['keyHash'].upper():
raise AuthError('Invalid Hash')
return True
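The key-hash check above boils down to `hash(hash(password + salt))` compared against the hex digest from the info line. A Python 3 sketch of the same derivation (MD5 as in the code above; the salt and passwords here are made-up inputs):

```python
import hashlib

def gntp_keyhash(password: bytes, salt: bytes) -> str:
    # key = H(password + salt); keyHash = H(key), sent as upper-case hex
    key = hashlib.md5(password + salt).digest()
    return hashlib.md5(key).hexdigest().upper()

salt = b'\x01\x02\x03\x04'
sent = gntp_keyhash(b'secret', salt)
assert gntp_keyhash(b'secret', salt) == sent   # correct password verifies
assert gntp_keyhash(b'wrong', salt) != sent    # wrong password is rejected
```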
def validate(self):
"""Verify required headers"""
for header in self._requiredHeaders:
if not self.headers.get(header, False):
raise ParseError('Missing Notification Header: ' + header)
def _format_info(self):
"""Generate info line for GNTP Message
:return string:
"""
info = u'GNTP/%s %s' % (
self.info.get('version'),
self.info.get('messagetype'),
)
if self.info.get('encryptionAlgorithmID', None):
info += ' %s:%s' % (
self.info.get('encryptionAlgorithmID'),
self.info.get('ivValue'),
)
else:
info += ' NONE'
if self.info.get('keyHashAlgorithmID', None):
info += ' %s:%s.%s' % (
self.info.get('keyHashAlgorithmID'),
self.info.get('keyHash'),
self.info.get('salt')
)
return info
def _parse_dict(self, data):
"""Helper function to parse blocks of GNTP headers into a dictionary
:param string data:
:return dict:
"""
dict = {}
for line in data.split('\r\n'):
match = GNTP_HEADER.match(line)
if not match:
continue
key = unicode(match.group(1).strip(), 'utf8', 'replace')
val = unicode(match.group(2).strip(), 'utf8', 'replace')
dict[key] = val
return dict
def add_header(self, key, value):
if isinstance(value, unicode):
self.headers[key] = value
else:
self.headers[key] = unicode('%s' % value, 'utf8', 'replace')
def add_resource(self, data):
"""Add binary resource
:param string data: Binary Data
"""
identifier = hashlib.md5(data).hexdigest()
self.resources[identifier] = data
return 'x-growl-resource://%s' % identifier
def decode(self, data, password=None):
"""Decode GNTP Message
:param string data:
"""
self.password = password
self.raw = data
parts = self.raw.split('\r\n\r\n')
self.info = self._parse_info(data)
self.headers = self._parse_dict(parts[0])
def encode(self):
"""Encode a generic GNTP Message
:return string: GNTP Message ready to be sent
"""
buffer = _GNTPBuffer()
buffer.writefmt(self._format_info())
#Headers
for k, v in self.headers.iteritems():
buffer.writefmt('%s: %s', k, v)
buffer.writefmt()
#Resources
for resource, data in self.resources.iteritems():
buffer.writefmt('Identifier: %s', resource)
buffer.writefmt('Length: %d', len(data))
buffer.writefmt()
buffer.write(data)
buffer.writefmt()
buffer.writefmt()
return buffer.getvalue()
class GNTPRegister(_GNTPBase):
"""Represents a GNTP Registration Command
:param string data: (Optional) See decode()
:param string password: (Optional) Password to use while encoding/decoding messages
"""
_requiredHeaders = [
'Application-Name',
'Notifications-Count'
]
_requiredNotificationHeaders = ['Notification-Name']
def __init__(self, data=None, password=None):
_GNTPBase.__init__(self, 'REGISTER')
self.notifications = []
if data:
self.decode(data, password)
else:
self.set_password(password)
self.add_header('Application-Name', 'pygntp')
self.add_header('Notifications-Count', 0)
def validate(self):
'''Validate required headers and validate notification headers'''
for header in self._requiredHeaders:
if not self.headers.get(header, False):
raise ParseError('Missing Registration Header: ' + header)
for notice in self.notifications:
for header in self._requiredNotificationHeaders:
if not notice.get(header, False):
raise ParseError('Missing Notification Header: ' + header)
def decode(self, data, password):
"""Decode existing GNTP Registration message
:param string data: Message to decode
"""
self.raw = data
parts = self.raw.split('\r\n\r\n')
self.info = self._parse_info(data)
self._validate_password(password)
self.headers = self._parse_dict(parts[0])
for i, part in enumerate(parts):
if i == 0:
continue # Skip Header
if part.strip() == '':
continue
notice = self._parse_dict(part)
if notice.get('Notification-Name', False):
self.notifications.append(notice)
elif notice.get('Identifier', False):
notice['Data'] = self._decode_binary(part, notice)
#open('register.png','wblol').write(notice['Data'])
self.resources[notice.get('Identifier')] = notice
def add_notification(self, name, enabled=True):
"""Add new Notification to Registration message
:param string name: Notification Name
:param boolean enabled: Enable this notification by default
"""
notice = {}
notice['Notification-Name'] = u'%s' % name
notice['Notification-Enabled'] = u'%s' % enabled
self.notifications.append(notice)
self.add_header('Notifications-Count', len(self.notifications))
def encode(self):
"""Encode a GNTP Registration Message
:return string: Encoded GNTP Registration message
"""
buffer = _GNTPBuffer()
buffer.writefmt(self._format_info())
#Headers
for k, v in self.headers.iteritems():
buffer.writefmt('%s: %s', k, v)
buffer.writefmt()
#Notifications
if len(self.notifications) > 0:
for notice in self.notifications:
for k, v in notice.iteritems():
buffer.writefmt('%s: %s', k, v)
buffer.writefmt()
#Resources
for resource, data in self.resources.iteritems():
buffer.writefmt('Identifier: %s', resource)
buffer.writefmt('Length: %d', len(data))
buffer.writefmt()
buffer.write(data)
buffer.writefmt()
buffer.writefmt()
return buffer.getvalue()
class GNTPNotice(_GNTPBase):
"""Represents a GNTP Notification Command
:param string data: (Optional) See decode()
:param string app: (Optional) Set Application-Name
:param string name: (Optional) Set Notification-Name
:param string title: (Optional) Set Notification Title
:param string password: (Optional) Password to use while encoding/decoding messages
"""
_requiredHeaders = [
'Application-Name',
'Notification-Name',
'Notification-Title'
]
def __init__(self, data=None, app=None, name=None, title=None, password=None):
_GNTPBase.__init__(self, 'NOTIFY')
if data:
self.decode(data, password)
else:
self.set_password(password)
if app:
self.add_header('Application-Name', app)
if name:
self.add_header('Notification-Name', name)
if title:
self.add_header('Notification-Title', title)
def decode(self, data, password):
"""Decode existing GNTP Notification message
:param string data: Message to decode.
"""
self.raw = data
parts = self.raw.split('\r\n\r\n')
self.info = self._parse_info(data)
self._validate_password(password)
self.headers = self._parse_dict(parts[0])
for i, part in enumerate(parts):
if i == 0:
continue # Skip Header
if part.strip() == '':
continue
notice = self._parse_dict(part)
if notice.get('Identifier', False):
notice['Data'] = self._decode_binary(part, notice)
#open('notice.png','wblol').write(notice['Data'])
self.resources[notice.get('Identifier')] = notice
class GNTPSubscribe(_GNTPBase):
"""Represents a GNTP Subscribe Command
:param string data: (Optional) See decode()
:param string password: (Optional) Password to use while encoding/decoding messages
"""
_requiredHeaders = [
'Subscriber-ID',
'Subscriber-Name',
]
def __init__(self, data=None, password=None):
_GNTPBase.__init__(self, 'SUBSCRIBE')
if data:
self.decode(data, password)
else:
self.set_password(password)
class GNTPOK(_GNTPBase):
"""Represents a GNTP OK Response
:param string data: (Optional) See _GNTPResponse.decode()
:param string action: (Optional) Set type of action the OK Response is for
"""
_requiredHeaders = ['Response-Action']
def __init__(self, data=None, action=None):
_GNTPBase.__init__(self, '-OK')
if data:
self.decode(data)
if action:
self.add_header('Response-Action', action)
class GNTPError(_GNTPBase):
"""Represents a GNTP Error response
:param string data: (Optional) See _GNTPResponse.decode()
:param string errorcode: (Optional) Error code
:param string errordesc: (Optional) Error Description
"""
_requiredHeaders = ['Error-Code', 'Error-Description']
def __init__(self, data=None, errorcode=None, errordesc=None):
_GNTPBase.__init__(self, '-ERROR')
if data:
self.decode(data)
if errorcode:
self.add_header('Error-Code', errorcode)
self.add_header('Error-Description', errordesc)
def error(self):
return (self.headers.get('Error-Code', None),
self.headers.get('Error-Description', None))
def parse_gntp(data, password=None):
"""Attempt to parse a message as a GNTP message
:param string data: Message to be parsed
:param string password: Optional password to be used to verify the message
"""
match = GNTP_INFO_LINE_SHORT.match(data)
if not match:
raise ParseError('INVALID_GNTP_INFO')
info = match.groupdict()
if info['messagetype'] == 'REGISTER':
return GNTPRegister(data, password=password)
elif info['messagetype'] == 'NOTIFY':
return GNTPNotice(data, password=password)
elif info['messagetype'] == 'SUBSCRIBE':
return GNTPSubscribe(data, password=password)
elif info['messagetype'] == '-OK':
return GNTPOK(data)
elif info['messagetype'] == '-ERROR':
return GNTPError(data)
raise ParseError('INVALID_GNTP_MESSAGE')

gntp/cli.py Normal file (141 lines)
View File

@@ -0,0 +1,141 @@
# Copyright: 2013 Paul Traylor
# These sources are released under the terms of the MIT license: see LICENSE
import logging
import os
import sys
from optparse import OptionParser, OptionGroup
from gntp.notifier import GrowlNotifier
from gntp.shim import RawConfigParser
from gntp.version import __version__
DEFAULT_CONFIG = os.path.expanduser('~/.gntp')
config = RawConfigParser({
'hostname': 'localhost',
'password': None,
'port': 23053,
})
config.read([DEFAULT_CONFIG])
if not config.has_section('gntp'):
config.add_section('gntp')
class ClientParser(OptionParser):
def __init__(self):
OptionParser.__init__(self, version="%%prog %s" % __version__)
group = OptionGroup(self, "Network Options")
group.add_option("-H", "--host",
dest="host", default=config.get('gntp', 'hostname'),
help="Specify a hostname to which to send a remote notification. [%default]")
group.add_option("--port",
dest="port", default=config.getint('gntp', 'port'), type="int",
help="port to listen on [%default]")
group.add_option("-P", "--password",
dest='password', default=config.get('gntp', 'password'),
help="Network password")
self.add_option_group(group)
group = OptionGroup(self, "Notification Options")
group.add_option("-n", "--name",
dest="app", default='Python GNTP Test Client',
help="Set the name of the application [%default]")
group.add_option("-s", "--sticky",
dest='sticky', default=False, action="store_true",
help="Make the notification sticky [%default]")
group.add_option("--image",
dest="icon", default=None,
help="Icon for notification (URL or /path/to/file)")
group.add_option("-m", "--message",
dest="message", default=None,
help="Sets the message instead of using stdin")
group.add_option("-p", "--priority",
dest="priority", default=0, type="int",
help="-2 to 2 [%default]")
group.add_option("-d", "--identifier",
dest="identifier",
help="Identifier for coalescing")
group.add_option("-t", "--title",
dest="title", default=None,
help="Set the title of the notification [%default]")
group.add_option("-N", "--notification",
dest="name", default='Notification',
help="Set the notification name [%default]")
group.add_option("--callback",
dest="callback",
help="URL callback")
self.add_option_group(group)
# Extra Options
self.add_option('-v', '--verbose',
dest='verbose', default=0, action='count',
help="Verbosity levels")
def parse_args(self, args=None, values=None):
values, args = OptionParser.parse_args(self, args, values)
if values.message is None:
print('Enter a message followed by Ctrl-D')
try:
message = sys.stdin.read()
except KeyboardInterrupt:
exit()
else:
message = values.message
if values.title is None:
values.title = ' '.join(args)
# If we still have an empty title, use the
# first bit of the message as the title
if values.title == '':
values.title = message[:20]
values.verbose = logging.WARNING - values.verbose * 10
return values, message
def main():
(options, message) = ClientParser().parse_args()
logging.basicConfig(level=options.verbose)
if not os.path.exists(DEFAULT_CONFIG):
logging.info('No config read found at %s', DEFAULT_CONFIG)
growl = GrowlNotifier(
applicationName=options.app,
notifications=[options.name],
defaultNotifications=[options.name],
hostname=options.host,
password=options.password,
port=options.port,
)
result = growl.register()
if result is not True:
exit(result)
# This would likely be better placed within the growl notifier
# class but until I make _checkIcon smarter this is "easier"
if options.icon and growl._checkIcon(options.icon) is False:
logging.info('Loading image %s', options.icon)
f = open(options.icon, 'rb')
options.icon = f.read()
f.close()
result = growl.notify(
noteType=options.name,
title=options.title,
description=message,
icon=options.icon,
sticky=options.sticky,
priority=options.priority,
callback=options.callback,
identifier=options.identifier,
)
if result is not True:
exit(result)
if __name__ == "__main__":
main()

gntp/config.py Normal file (77 lines)
View File

@@ -0,0 +1,77 @@
# Copyright: 2013 Paul Traylor
# These sources are released under the terms of the MIT license: see LICENSE
"""
The gntp.config module is provided as an extended GrowlNotifier object that takes
advantage of the ConfigParser module to allow us to setup some default values
(such as hostname, password, and port) in a more global way to be shared among
programs using gntp
"""
import logging
import os
import gntp.notifier
import gntp.shim
__all__ = [
'mini',
'GrowlNotifier'
]
logger = logging.getLogger(__name__)
class GrowlNotifier(gntp.notifier.GrowlNotifier):
"""
ConfigParser enhanced GrowlNotifier object
For right now, we are only interested in letting users override certain
values from ~/.gntp
::
[gntp]
hostname = ?
password = ?
port = ?
"""
def __init__(self, *args, **kwargs):
config = gntp.shim.RawConfigParser({
'hostname': kwargs.get('hostname', 'localhost'),
'password': kwargs.get('password'),
'port': kwargs.get('port', 23053),
})
config.read([os.path.expanduser('~/.gntp')])
# If the file does not exist, then there will be no gntp section defined
# and the config.get() lines below will get confused. Since we are not
# saving the config, it should be safe to just add it here so the
# code below doesn't complain
if not config.has_section('gntp'):
logger.info('Error reading ~/.gntp config file')
config.add_section('gntp')
kwargs['password'] = config.get('gntp', 'password')
kwargs['hostname'] = config.get('gntp', 'hostname')
kwargs['port'] = config.getint('gntp', 'port')
super(GrowlNotifier, self).__init__(*args, **kwargs)
def mini(description, **kwargs):
"""Single notification function
Simple notification function in one line. Has only one required parameter
and attempts to use reasonable defaults for everything else
:param string description: Notification message
"""
kwargs['notifierFactory'] = GrowlNotifier
gntp.notifier.mini(description, **kwargs)
if __name__ == '__main__':
# If we're running this module directly we're likely running it as a test
# so extra debugging is useful
logging.basicConfig(level=logging.INFO)
mini('Testing mini notification')
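The override pattern this module implements — constructor kwargs become parser defaults, while values found in the user's `~/.gntp` file win over them — can be sketched without touching the filesystem. This is a minimal, self-contained sketch with a hypothetical `resolve_settings` helper, using `read_string` in place of `read`:

```python
import configparser

# Sketch of the gntp.config override pattern: kwargs become
# RawConfigParser defaults, config-file values take precedence.
def resolve_settings(config_text, **kwargs):
    config = configparser.RawConfigParser({
        'hostname': kwargs.get('hostname', 'localhost'),
        'password': kwargs.get('password'),
        'port': kwargs.get('port', 23053),
    })
    config.read_string(config_text)
    # Mirror the safety net above: without a [gntp] section,
    # the config.get() calls below would raise NoSectionError.
    if not config.has_section('gntp'):
        config.add_section('gntp')
    return {
        'hostname': config.get('gntp', 'hostname'),
        'password': config.get('gntp', 'password'),
        'port': config.getint('gntp', 'port'),
    }

# The file value overrides the default hostname; port falls back to
# the kwarg; password stays at its (None) default.
settings = resolve_settings('[gntp]\nhostname = growl.local\n', port=9999)
```

Note that `RawConfigParser` (rather than `ConfigParser`) is used precisely because it tolerates non-string default values such as `None` and the integer port.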

518
gntp/core.py Normal file
View File

@@ -0,0 +1,518 @@
# Copyright: 2013 Paul Traylor
# These sources are released under the terms of the MIT license: see LICENSE
import hashlib
import re
import time
import gntp.shim
import gntp.errors as errors
__all__ = [
'GNTPRegister',
'GNTPNotice',
'GNTPSubscribe',
'GNTPOK',
'GNTPError',
'parse_gntp',
]
#GNTP/<version> <messagetype> <encryptionAlgorithmID>[:<ivValue>][ <keyHashAlgorithmID>:<keyHash>.<salt>]
GNTP_INFO_LINE = re.compile(
r'GNTP/(?P<version>\d+\.\d+) (?P<messagetype>REGISTER|NOTIFY|SUBSCRIBE|\-OK|\-ERROR)' +
r' (?P<encryptionAlgorithmID>[A-Z0-9]+(:(?P<ivValue>[A-F0-9]+))?) ?' +
r'((?P<keyHashAlgorithmID>[A-Z0-9]+):(?P<keyHash>[A-F0-9]+).(?P<salt>[A-F0-9]+))?\r\n',
re.IGNORECASE
)
GNTP_INFO_LINE_SHORT = re.compile(
r'GNTP/(?P<version>\d+\.\d+) (?P<messagetype>REGISTER|NOTIFY|SUBSCRIBE|\-OK|\-ERROR)',
re.IGNORECASE
)
GNTP_HEADER = re.compile(r'([\w-]+):(.+)')
GNTP_EOL = gntp.shim.b('\r\n')
GNTP_SEP = gntp.shim.b(': ')
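The info-line format documented in the comment above can be seen in action by applying the same pattern to a sample unencrypted REGISTER line. A self-contained sketch (the regex is re-created here so the snippet runs on its own):

```python
import re

# Re-creation of GNTP_INFO_LINE above, exercised on a sample line.
GNTP_INFO_LINE = re.compile(
    r'GNTP/(?P<version>\d+\.\d+) (?P<messagetype>REGISTER|NOTIFY|SUBSCRIBE|\-OK|\-ERROR)'
    r' (?P<encryptionAlgorithmID>[A-Z0-9]+(:(?P<ivValue>[A-F0-9]+))?) ?'
    r'((?P<keyHashAlgorithmID>[A-Z0-9]+):(?P<keyHash>[A-F0-9]+).(?P<salt>[A-F0-9]+))?\r\n',
    re.IGNORECASE
)

# An unencrypted, unauthenticated REGISTER info line: the optional
# keyHash group simply does not match.
info = GNTP_INFO_LINE.match('GNTP/1.0 REGISTER NONE\r\n').groupdict()
```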
class _GNTPBuffer(gntp.shim.StringIO):
"""GNTP Buffer class"""
def writeln(self, value=None):
if value:
self.write(gntp.shim.b(value))
self.write(GNTP_EOL)
def writeheader(self, key, value):
if not isinstance(value, str):
value = str(value)
self.write(gntp.shim.b(key))
self.write(GNTP_SEP)
self.write(gntp.shim.b(value))
self.write(GNTP_EOL)
class _GNTPBase(object):
"""Base initilization
:param string messagetype: GNTP Message type
:param string version: GNTP Protocol version
:param string encription: Encryption protocol
"""
def __init__(self, messagetype=None, version='1.0', encryption=None):
self.info = {
'version': version,
'messagetype': messagetype,
'encryptionAlgorithmID': encryption
}
self.hash_algo = {
'MD5': hashlib.md5,
'SHA1': hashlib.sha1,
'SHA256': hashlib.sha256,
'SHA512': hashlib.sha512,
}
self.headers = {}
self.resources = {}
# For Python2 we can just return the bytes as is without worry
# but on Python3 we want to make sure we return the packet as
# a unicode string so that things like logging won't get confused
if gntp.shim.PY2:
def __str__(self):
return self.encode()
else:
def __str__(self):
return gntp.shim.u(self.encode())
def _parse_info(self, data):
"""Parse the first line of a GNTP message to get security and other info values
:param string data: GNTP Message
:return dict: Parsed GNTP Info line
"""
match = GNTP_INFO_LINE.match(data)
if not match:
raise errors.ParseError('ERROR_PARSING_INFO_LINE')
info = match.groupdict()
if info['encryptionAlgorithmID'] == 'NONE':
info['encryptionAlgorithmID'] = None
return info
def set_password(self, password, encryptAlgo='MD5'):
"""Set a password for a GNTP Message
:param string password: Null to clear password
:param string encryptAlgo: Supports MD5, SHA1, SHA256, SHA512
"""
if not password:
self.info['encryptionAlgorithmID'] = None
self.info['keyHashAlgorithmID'] = None
return
self.password = gntp.shim.b(password)
self.encryptAlgo = encryptAlgo.upper()
if self.encryptAlgo not in self.hash_algo:
raise errors.UnsupportedError('INVALID HASH "%s"' % self.encryptAlgo)
hashfunction = self.hash_algo.get(self.encryptAlgo)
password = password.encode('utf8')
seed = time.ctime().encode('utf8')
salt = hashfunction(seed).hexdigest()
saltHash = hashfunction(seed).digest()
keyBasis = password + saltHash
key = hashfunction(keyBasis).digest()
keyHash = hashfunction(key).hexdigest()
self.info['keyHashAlgorithmID'] = self.encryptAlgo
self.info['keyHash'] = keyHash.upper()
self.info['salt'] = salt.upper()
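The key-hashing scheme in `set_password` can be sketched end to end: the sender derives `keyHash` from the password and a salt digest, and the receiver (see `_validate_password`) recomputes it from the transmitted hex salt and compares. A minimal, self-contained sketch with hypothetical helper names (`make_key_hash`, `verify_key_hash`), using `bytes.fromhex` in place of `_decode_hex`:

```python
import hashlib

def make_key_hash(password, seed, hashfunction=hashlib.md5):
    # The salt sent on the wire is hex; its raw digest feeds the key,
    # exactly as set_password() does above.
    salt_hex = hashfunction(seed).hexdigest().upper()
    salt_raw = hashfunction(seed).digest()
    key = hashfunction(password + salt_raw).digest()
    key_hash = hashfunction(key).hexdigest().upper()
    return key_hash, salt_hex

def verify_key_hash(password, salt_hex, key_hash, hashfunction=hashlib.md5):
    # bytes.fromhex plays the role of _decode_hex() on the receiving side.
    salt_raw = bytes.fromhex(salt_hex)
    key = hashfunction(password + salt_raw).digest()
    return hashfunction(key).hexdigest().upper() == key_hash

key_hash, salt = make_key_hash(b'secret', b'a-known-seed')
```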
def _decode_hex(self, value):
"""Helper function to decode hex string to `proper` hex string
:param string value: Human readable hex string
:return string: Hex string
"""
result = ''
for i in range(0, len(value), 2):
tmp = int(value[i:i + 2], 16)
result += chr(tmp)
return result
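For illustration, `_decode_hex` turns each pair of hex digits into one character; in Python 3, `bytes.fromhex` performs the same conversion. A standalone re-implementation (hypothetical name `decode_hex`):

```python
# Each pair of hex digits becomes one character of the result.
def decode_hex(value):
    result = ''
    for i in range(0, len(value), 2):
        result += chr(int(value[i:i + 2], 16))
    return result

decoded = decode_hex('4142')  # two hex pairs: 0x41, 0x42
```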
def _decode_binary(self, rawIdentifier, identifier):
rawIdentifier += '\r\n\r\n'
dataLength = int(identifier['Length'])
pointerStart = self.raw.find(rawIdentifier) + len(rawIdentifier)
pointerEnd = pointerStart + dataLength
data = self.raw[pointerStart:pointerEnd]
if not len(data) == dataLength:
raise errors.ParseError('INVALID_DATA_LENGTH Expected: %s Received: %s' % (dataLength, len(data)))
return data
def _validate_password(self, password):
"""Validate GNTP Message against stored password"""
self.password = password
if password is None:
raise errors.AuthError('Missing password')
keyHash = self.info.get('keyHash', None)
if keyHash is None and self.password is None:
return True
if keyHash is None:
raise errors.AuthError('Invalid keyHash')
if self.password is None:
raise errors.AuthError('Missing password')
keyHashAlgorithmID = self.info.get('keyHashAlgorithmID', 'MD5')
password = self.password.encode('utf8')
saltHash = self._decode_hex(self.info['salt'])
keyBasis = password + saltHash
self.key = self.hash_algo[keyHashAlgorithmID](keyBasis).digest()
keyHash = self.hash_algo[keyHashAlgorithmID](self.key).hexdigest()
if not keyHash.upper() == self.info['keyHash'].upper():
raise errors.AuthError('Invalid Hash')
return True
def validate(self):
"""Verify required headers"""
for header in self._requiredHeaders:
if not self.headers.get(header, False):
raise errors.ParseError('Missing Notification Header: ' + header)
def _format_info(self):
"""Generate info line for GNTP Message
:return string:
"""
info = 'GNTP/%s %s' % (
self.info.get('version'),
self.info.get('messagetype'),
)
if self.info.get('encryptionAlgorithmID', None):
info += ' %s:%s' % (
self.info.get('encryptionAlgorithmID'),
self.info.get('ivValue'),
)
else:
info += ' NONE'
if self.info.get('keyHashAlgorithmID', None):
info += ' %s:%s.%s' % (
self.info.get('keyHashAlgorithmID'),
self.info.get('keyHash'),
self.info.get('salt')
)
return info
def _parse_dict(self, data):
"""Helper function to parse blocks of GNTP headers into a dictionary
:param string data:
:return dict: Dictionary of parsed GNTP Headers
"""
d = {}
for line in data.split('\r\n'):
match = GNTP_HEADER.match(line)
if not match:
continue
key = match.group(1).strip()
val = match.group(2).strip()
d[key] = val
return d
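The header-block parsing done by `_parse_dict` reduces to matching each CRLF-separated line against `GNTP_HEADER` and skipping anything that is not a header. A self-contained sketch (hypothetical name `parse_headers`, regex re-created locally):

```python
import re

GNTP_HEADER = re.compile(r'([\w-]+):(.+)')  # same pattern as above

def parse_headers(data):
    headers = {}
    for line in data.split('\r\n'):
        match = GNTP_HEADER.match(line)
        if not match:
            continue  # blank or malformed lines are silently skipped
        headers[match.group(1).strip()] = match.group(2).strip()
    return headers

headers = parse_headers('Application-Name: Example\r\nNotifications-Count: 1\r\n')
```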
def add_header(self, key, value):
self.headers[key] = value
def add_resource(self, data):
"""Add binary resource
:param string data: Binary Data
"""
data = gntp.shim.b(data)
identifier = hashlib.md5(data).hexdigest()
self.resources[identifier] = data
return 'x-growl-resource://%s' % identifier
def decode(self, data, password=None):
"""Decode GNTP Message
:param string data:
"""
self.password = password
self.raw = gntp.shim.u(data)
parts = self.raw.split('\r\n\r\n')
self.info = self._parse_info(self.raw)
self.headers = self._parse_dict(parts[0])
def encode(self):
"""Encode a generic GNTP Message
:return string: GNTP Message ready to be sent. Returned as a byte string
"""
buff = _GNTPBuffer()
buff.writeln(self._format_info())
#Headers
for k, v in self.headers.items():
buff.writeheader(k, v)
buff.writeln()
#Resources
for resource, data in self.resources.items():
buff.writeheader('Identifier', resource)
buff.writeheader('Length', len(data))
buff.writeln()
buff.write(data)
buff.writeln()
buff.writeln()
return buff.getvalue()
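`encode()` frames a packet as an info line, CRLF-delimited `Key: value` headers, and a terminating blank line. A minimal sketch of that framing (hypothetical `encode_packet`, headers only, no resource blocks):

```python
from io import BytesIO

def encode_packet(info_line, headers):
    buff = BytesIO()
    buff.write(info_line.encode('utf8') + b'\r\n')
    for key, value in headers.items():
        buff.write(key.encode('utf8') + b': ' + str(value).encode('utf8') + b'\r\n')
    buff.write(b'\r\n')  # blank line ends the header block
    return buff.getvalue()

packet = encode_packet('GNTP/1.0 NOTIFY NONE', {'Notification-Title': 'Hi'})
```

The trailing `\r\n\r\n` is what the receive loop in `_send()` waits for.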
class GNTPRegister(_GNTPBase):
"""Represents a GNTP Registration Command
:param string data: (Optional) See decode()
:param string password: (Optional) Password to use while encoding/decoding messages
"""
_requiredHeaders = [
'Application-Name',
'Notifications-Count'
]
_requiredNotificationHeaders = ['Notification-Name']
def __init__(self, data=None, password=None):
_GNTPBase.__init__(self, 'REGISTER')
self.notifications = []
if data:
self.decode(data, password)
else:
self.set_password(password)
self.add_header('Application-Name', 'pygntp')
self.add_header('Notifications-Count', 0)
def validate(self):
'''Validate required headers and validate notification headers'''
for header in self._requiredHeaders:
if not self.headers.get(header, False):
raise errors.ParseError('Missing Registration Header: ' + header)
for notice in self.notifications:
for header in self._requiredNotificationHeaders:
if not notice.get(header, False):
raise errors.ParseError('Missing Notification Header: ' + header)
def decode(self, data, password):
"""Decode existing GNTP Registration message
:param string data: Message to decode
"""
self.raw = gntp.shim.u(data)
parts = self.raw.split('\r\n\r\n')
self.info = self._parse_info(self.raw)
self._validate_password(password)
self.headers = self._parse_dict(parts[0])
for i, part in enumerate(parts):
if i == 0:
continue # Skip Header
if part.strip() == '':
continue
notice = self._parse_dict(part)
if notice.get('Notification-Name', False):
self.notifications.append(notice)
elif notice.get('Identifier', False):
notice['Data'] = self._decode_binary(part, notice)
#open('register.png','wblol').write(notice['Data'])
self.resources[notice.get('Identifier')] = notice
def add_notification(self, name, enabled=True):
"""Add new Notification to Registration message
:param string name: Notification Name
:param boolean enabled: Enable this notification by default
"""
notice = {}
notice['Notification-Name'] = name
notice['Notification-Enabled'] = enabled
self.notifications.append(notice)
self.add_header('Notifications-Count', len(self.notifications))
def encode(self):
"""Encode a GNTP Registration Message
:return string: Encoded GNTP Registration message. Returned as a byte string
"""
buff = _GNTPBuffer()
buff.writeln(self._format_info())
#Headers
for k, v in self.headers.items():
buff.writeheader(k, v)
buff.writeln()
#Notifications
if len(self.notifications) > 0:
for notice in self.notifications:
for k, v in notice.items():
buff.writeheader(k, v)
buff.writeln()
#Resources
for resource, data in self.resources.items():
buff.writeheader('Identifier', resource)
buff.writeheader('Length', len(data))
buff.writeln()
buff.write(data)
buff.writeln()
buff.writeln()
return buff.getvalue()
class GNTPNotice(_GNTPBase):
"""Represents a GNTP Notification Command
:param string data: (Optional) See decode()
:param string app: (Optional) Set Application-Name
:param string name: (Optional) Set Notification-Name
:param string title: (Optional) Set Notification Title
:param string password: (Optional) Password to use while encoding/decoding messages
"""
_requiredHeaders = [
'Application-Name',
'Notification-Name',
'Notification-Title'
]
def __init__(self, data=None, app=None, name=None, title=None, password=None):
_GNTPBase.__init__(self, 'NOTIFY')
if data:
self.decode(data, password)
else:
self.set_password(password)
if app:
self.add_header('Application-Name', app)
if name:
self.add_header('Notification-Name', name)
if title:
self.add_header('Notification-Title', title)
def decode(self, data, password):
"""Decode existing GNTP Notification message
:param string data: Message to decode.
"""
self.raw = gntp.shim.u(data)
parts = self.raw.split('\r\n\r\n')
self.info = self._parse_info(self.raw)
self._validate_password(password)
self.headers = self._parse_dict(parts[0])
for i, part in enumerate(parts):
if i == 0:
continue # Skip Header
if part.strip() == '':
continue
notice = self._parse_dict(part)
if notice.get('Identifier', False):
notice['Data'] = self._decode_binary(part, notice)
#open('notice.png','wblol').write(notice['Data'])
self.resources[notice.get('Identifier')] = notice
class GNTPSubscribe(_GNTPBase):
"""Represents a GNTP Subscribe Command
:param string data: (Optional) See decode()
:param string password: (Optional) Password to use while encoding/decoding messages
"""
_requiredHeaders = [
'Subscriber-ID',
'Subscriber-Name',
]
def __init__(self, data=None, password=None):
_GNTPBase.__init__(self, 'SUBSCRIBE')
if data:
self.decode(data, password)
else:
self.set_password(password)
class GNTPOK(_GNTPBase):
"""Represents a GNTP OK Response
:param string data: (Optional) See _GNTPResponse.decode()
:param string action: (Optional) Set type of action the OK Response is for
"""
_requiredHeaders = ['Response-Action']
def __init__(self, data=None, action=None):
_GNTPBase.__init__(self, '-OK')
if data:
self.decode(data)
if action:
self.add_header('Response-Action', action)
class GNTPError(_GNTPBase):
"""Represents a GNTP Error response
:param string data: (Optional) See _GNTPResponse.decode()
:param string errorcode: (Optional) Error code
:param string errordesc: (Optional) Error Description
"""
_requiredHeaders = ['Error-Code', 'Error-Description']
def __init__(self, data=None, errorcode=None, errordesc=None):
_GNTPBase.__init__(self, '-ERROR')
if data:
self.decode(data)
if errorcode:
self.add_header('Error-Code', errorcode)
self.add_header('Error-Description', errordesc)
def error(self):
return (self.headers.get('Error-Code', None),
self.headers.get('Error-Description', None))
def parse_gntp(data, password=None):
"""Attempt to parse a message as a GNTP message
:param string data: Message to be parsed
:param string password: Optional password to be used to verify the message
"""
data = gntp.shim.u(data)
match = GNTP_INFO_LINE_SHORT.match(data)
if not match:
raise errors.ParseError('INVALID_GNTP_INFO')
info = match.groupdict()
if info['messagetype'] == 'REGISTER':
return GNTPRegister(data, password=password)
elif info['messagetype'] == 'NOTIFY':
return GNTPNotice(data, password=password)
elif info['messagetype'] == 'SUBSCRIBE':
return GNTPSubscribe(data, password=password)
elif info['messagetype'] == '-OK':
return GNTPOK(data)
elif info['messagetype'] == '-ERROR':
return GNTPError(data)
raise errors.ParseError('INVALID_GNTP_MESSAGE')
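`parse_gntp` peeks at the message type with `GNTP_INFO_LINE_SHORT` and picks the matching class; the dispatch itself reduces to a lookup. A self-contained sketch (hypothetical `message_class`, returning class names instead of instances so it runs without the library):

```python
import re

GNTP_INFO_LINE_SHORT = re.compile(
    r'GNTP/(?P<version>\d+\.\d+) (?P<messagetype>REGISTER|NOTIFY|SUBSCRIBE|\-OK|\-ERROR)',
    re.IGNORECASE
)

def message_class(data):
    match = GNTP_INFO_LINE_SHORT.match(data)
    if not match:
        raise ValueError('INVALID_GNTP_INFO')
    return {
        'REGISTER': 'GNTPRegister',
        'NOTIFY': 'GNTPNotice',
        'SUBSCRIBE': 'GNTPSubscribe',
        '-OK': 'GNTPOK',
        '-ERROR': 'GNTPError',
    }[match.groupdict()['messagetype']]

kind = message_class('GNTP/1.0 -OK NONE\r\n\r\n')
```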

gntp/errors.py Normal file
@@ -0,0 +1,25 @@
# Copyright: 2013 Paul Traylor
# These sources are released under the terms of the MIT license: see LICENSE
class BaseError(Exception):
pass
class ParseError(BaseError):
errorcode = 500
errordesc = 'Error parsing the message'
class AuthError(BaseError):
errorcode = 400
errordesc = 'Error with authorization'
class UnsupportedError(BaseError):
errorcode = 500
errordesc = 'Currently unsupported by gntp.py'
class NetworkError(BaseError):
errorcode = 500
errordesc = "Error connecting to growl server"

@@ -1,3 +1,6 @@
# Copyright: 2013 Paul Traylor
# These sources are released under the terms of the MIT license: see LICENSE
"""
The gntp.notifier module is provided as a simple way to send notifications
using GNTP
@@ -9,10 +12,15 @@ using GNTP
`Original Python bindings <http://code.google.com/p/growl/source/browse/Bindings/python/Growl.py>`_
"""
import gntp
import socket
import logging
import platform
import socket
import sys
from gntp.version import __version__
import gntp.core
import gntp.errors as errors
import gntp.shim
__all__ = [
'mini',
@@ -22,45 +30,6 @@ __all__ = [
logger = logging.getLogger(__name__)
def mini(description, applicationName='PythonMini', noteType="Message",
title="Mini Message", applicationIcon=None, hostname='localhost',
password=None, port=23053, sticky=False, priority=None,
callback=None, notificationIcon=None, identifier=None):
"""Single notification function
Simple notification function in one line. Has only one required parameter
and attempts to use reasonable defaults for everything else
:param string description: Notification message
.. warning::
For now, only URL callbacks are supported. In the future, the
callback argument will also support a function
"""
growl = GrowlNotifier(
applicationName=applicationName,
notifications=[noteType],
defaultNotifications=[noteType],
applicationIcon=applicationIcon,
hostname=hostname,
password=password,
port=port,
)
result = growl.register()
if result is not True:
return result
return growl.notify(
noteType=noteType,
title=title,
description=description,
icon=notificationIcon,
sticky=sticky,
priority=priority,
callback=callback,
identifier=identifier,
)
class GrowlNotifier(object):
"""Helper class to simplfy sending Growl messages
@@ -99,8 +68,9 @@ class GrowlNotifier(object):
If it's a simple URL icon, then we return True. If it's a data icon
then we return False
'''
logger.debug('Checking icon')
return data.startswith('http')
logger.info('Checking icon')
return gntp.shim.u(data)[:4] in ['http', 'file']
def register(self):
"""Send GNTP Registration
@@ -109,8 +79,8 @@ class GrowlNotifier(object):
Before sending notifications to Growl, you need to have
sent a registration message at least once
"""
logger.debug('Sending registration to %s:%s', self.hostname, self.port)
register = gntp.GNTPRegister()
logger.info('Sending registration to %s:%s', self.hostname, self.port)
register = gntp.core.GNTPRegister()
register.add_header('Application-Name', self.applicationName)
for notification in self.notifications:
enabled = notification in self.defaultNotifications
@@ -119,8 +89,8 @@ class GrowlNotifier(object):
if self._checkIcon(self.applicationIcon):
register.add_header('Application-Icon', self.applicationIcon)
else:
id = register.add_resource(self.applicationIcon)
register.add_header('Application-Icon', id)
resource = register.add_resource(self.applicationIcon)
register.add_header('Application-Icon', resource)
if self.password:
register.set_password(self.password, self.passwordHash)
self.add_origin_info(register)
@@ -128,7 +98,7 @@ class GrowlNotifier(object):
return self._send('register', register)
def notify(self, noteType, title, description, icon=None, sticky=False,
priority=None, callback=None, identifier=None):
priority=None, callback=None, identifier=None, custom={}):
"""Send a GNTP notifications
.. warning::
@@ -141,14 +111,16 @@ class GrowlNotifier(object):
:param boolean sticky: Sticky notification
:param integer priority: Message priority level from -2 to 2
:param string callback: URL callback
:param dict custom: Custom attributes. Key names should be prefixed with X-
according to the spec but this is not enforced by this class
.. warning::
For now, only URL callbacks are supported. In the future, the
callback argument will also support a function
"""
logger.debug('Sending notification [%s] to %s:%s', noteType, self.hostname, self.port)
logger.info('Sending notification [%s] to %s:%s', noteType, self.hostname, self.port)
assert noteType in self.notifications
notice = gntp.GNTPNotice()
notice = gntp.core.GNTPNotice()
notice.add_header('Application-Name', self.applicationName)
notice.add_header('Notification-Name', noteType)
notice.add_header('Notification-Title', title)
@@ -162,8 +134,8 @@ class GrowlNotifier(object):
if self._checkIcon(icon):
notice.add_header('Notification-Icon', icon)
else:
id = notice.add_resource(icon)
notice.add_header('Notification-Icon', id)
resource = notice.add_resource(icon)
notice.add_header('Notification-Icon', resource)
if description:
notice.add_header('Notification-Text', description)
@@ -172,6 +144,9 @@ class GrowlNotifier(object):
if identifier:
notice.add_header('Notification-Coalescing-ID', identifier)
for key in custom:
notice.add_header(key, custom[key])
self.add_origin_info(notice)
self.notify_hook(notice)
@@ -179,7 +154,7 @@ class GrowlNotifier(object):
def subscribe(self, id, name, port):
"""Send a Subscribe request to a remote machine"""
sub = gntp.GNTPSubscribe()
sub = gntp.core.GNTPSubscribe()
sub.add_header('Subscriber-ID', id)
sub.add_header('Subscriber-Name', name)
sub.add_header('Subscriber-Port', port)
@@ -195,7 +170,7 @@ class GrowlNotifier(object):
"""Add optional Origin headers to message"""
packet.add_header('Origin-Machine-Name', platform.node())
packet.add_header('Origin-Software-Name', 'gntp.py')
packet.add_header('Origin-Software-Version', gntp.__version__)
packet.add_header('Origin-Software-Version', __version__)
packet.add_header('Origin-Platform-Name', platform.system())
packet.add_header('Origin-Platform-Version', platform.platform())
@@ -214,34 +189,78 @@ class GrowlNotifier(object):
packet.validate()
data = packet.encode()
#logger.debug('To : %s:%s <%s>\n%s', self.hostname, self.port, packet.__class__, data)
#Less verbose
logger.debug('To : %s:%s <%s>', self.hostname, self.port, packet.__class__)
logger.debug('To : %s:%s <%s>\n%s', self.hostname, self.port, packet.__class__, data)
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.settimeout(self.socketTimeout)
s.connect((self.hostname, self.port))
s.send(data)
recv_data = s.recv(1024)
while not recv_data.endswith("\r\n\r\n"):
recv_data += s.recv(1024)
response = gntp.parse_gntp(recv_data)
try:
s.connect((self.hostname, self.port))
s.send(data)
recv_data = s.recv(1024)
while not recv_data.endswith(gntp.shim.b("\r\n\r\n")):
recv_data += s.recv(1024)
except socket.error:
# Python2.5 and Python3 compatible exception
exc = sys.exc_info()[1]
raise errors.NetworkError(exc)
response = gntp.core.parse_gntp(recv_data)
s.close()
#logger.debug('From : %s:%s <%s>\n%s', self.hostname, self.port, response.__class__, response)
#Less verbose
logger.debug('From : %s:%s <%s>', self.hostname, self.port, response.__class__)
logger.debug('From : %s:%s <%s>\n%s', self.hostname, self.port, response.__class__, response)
if type(response) == gntp.GNTPOK:
return True
if response.error()[0] == '404' and 'disabled' in response.error()[1]:
# Ignore message saying that user has disabled this class
if type(response) == gntp.core.GNTPOK:
return True
logger.error('Invalid response: %s', response.error())
return response.error()
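The receive loop in `_send()` above keeps calling `recv()` until the CRLF-CRLF terminator that ends every GNTP response arrives. The same loop can be demonstrated without a socket by substituting a queue of byte chunks for `socket.recv` (hypothetical `read_response`):

```python
def read_response(chunks):
    # chunks stands in for successive socket.recv(1024) results.
    chunks = iter(chunks)
    data = next(chunks)
    while not data.endswith(b'\r\n\r\n'):
        data += next(chunks)
    return data

# A response split across two "recv" calls is reassembled whole.
response = read_response([b'GNTP/1.0 -OK NONE\r\nResp', b'onse-Action: NOTIFY\r\n\r\n'])
```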
def mini(description, applicationName='PythonMini', noteType="Message",
title="Mini Message", applicationIcon=None, hostname='localhost',
password=None, port=23053, sticky=False, priority=None,
callback=None, notificationIcon=None, identifier=None,
notifierFactory=GrowlNotifier):
"""Single notification function
Simple notification function in one line. Has only one required parameter
and attempts to use reasonable defaults for everything else
:param string description: Notification message
.. warning::
For now, only URL callbacks are supported. In the future, the
callback argument will also support a function
"""
try:
growl = notifierFactory(
applicationName=applicationName,
notifications=[noteType],
defaultNotifications=[noteType],
applicationIcon=applicationIcon,
hostname=hostname,
password=password,
port=port,
)
result = growl.register()
if result is not True:
return result
return growl.notify(
noteType=noteType,
title=title,
description=description,
icon=notificationIcon,
sticky=sticky,
priority=priority,
callback=callback,
identifier=identifier,
)
except Exception:
# We want the "mini" function to be simple and swallow Exceptions
# in order to be less invasive
logger.exception("Growl error")
if __name__ == '__main__':
# If we're running this module directly we're likely running it as a test
# so extra debugging is useful
logging.basicConfig(level=logging.DEBUG)
logging.basicConfig(level=logging.INFO)
mini('Testing mini notification')

gntp/shim.py Normal file
@@ -0,0 +1,46 @@
# Copyright: 2013 Paul Traylor
# These sources are released under the terms of the MIT license: see LICENSE
"""
Python2.5 and Python3.3 compatibility shim
Heavily inspired by the "six" library.
https://pypi.python.org/pypi/six
"""
import sys
PY2 = sys.version_info[0] == 2
PY3 = sys.version_info[0] == 3
if PY3:
def b(s):
if isinstance(s, bytes):
return s
return s.encode('utf8', 'replace')
def u(s):
if isinstance(s, bytes):
return s.decode('utf8', 'replace')
return s
from io import BytesIO as StringIO
from configparser import RawConfigParser
else:
def b(s):
if isinstance(s, unicode):
return s.encode('utf8', 'replace')
return s
def u(s):
if isinstance(s, unicode):
return s
if isinstance(s, int):
s = str(s)
return unicode(s, "utf8", "replace")
from StringIO import StringIO
from ConfigParser import RawConfigParser
b.__doc__ = "Ensure we have a byte string"
u.__doc__ = "Ensure we have a unicode string"

gntp/version.py Normal file
@@ -0,0 +1,4 @@
# Copyright: 2013 Paul Traylor
# These sources are released under the terms of the MIT license: see LICENSE
__version__ = '1.0.3'


@@ -13,6 +13,18 @@
<!--#end for#-->
<!--#end def#-->
<!--#def show_cat_box($section_label)#-->
<div class="col2-cats" <!--#if int($getVar($section_label + '_enable')) > 0 then '' else 'style="display:none"'#-->>
<hr>
<b>$T('affectedCat')</b><br/>
<select name="${section_label}_cats" multiple="multiple" class="multiple_cats">
<!--#for $ct in $categories#-->
<option value="$ct" <!--#if $ct in $getVar($section_label + '_cats') then 'selected="selected"' else ""#-->>$Tspec($ct)</option>
<!--#end for#-->
</select>
</div>
<!--#end def#-->
<div class="colmask">
<form action="saveEmail" method="post" name="fullform" class="fullform" autocomplete="off" novalidate>
<input type="hidden" id="session" name="session" value="$session" />
@@ -20,7 +32,15 @@
<div class="section" id="email">
<div class="col2">
<h3>$T('cmenu-email') <a href="$helpuri$help_uri#toc0" target="_blank"><span class="glyphicon glyphicon-question-sign"></span></a></h3>
</div><!-- /col2 -->
<div class="col2-cats" <!--#if int($email_endjob) > 0 then '' else 'style="display:none"'#-->>
<b>$T('affectedCat')</b><br/>
<select name="email_cats" multiple="multiple" class="multiple_cats">
<!--#for $ct in $categories#-->
<option value="$ct" <!--#if $ct in $email_cats then 'selected="selected"' else ""#-->>$Tspec($ct)</option>
<!--#end for#-->
</select>
</div>
</div>
<div class="col1">
<fieldset>
<div class="field-pair">
@@ -79,8 +99,8 @@
<div class="alert"></div>
</div>
</fieldset>
</div><!-- /col1 -->
</div><!-- /section -->
</div>
</div>
<!--#if $have_ncenter#-->
<div class="section">
<div class="col2">
@@ -91,7 +111,7 @@
<td><label for="ncenter_enable"> $T('opt-ncenter_enable')</label></td>
</tr>
</table>
</div><!-- /col2 -->
</div>
<div class="col1" <!--#if int($ncenter_enable) > 0 then '' else 'style="display:none"'#-->>
<fieldset>
$show_notify_checkboxes('ncenter')
@@ -103,8 +123,8 @@
<div class="alert"></div>
</div>
</fieldset>
</div><!-- /col1 -->
</div><!-- /section -->
</div>
</div>
<!--#end if#-->
<!--#if $nt#-->
<div class="section">
@@ -116,7 +136,8 @@
<td><label for="acenter_enable"> $T('opt-acenter_enable')</label></td>
</tr>
</table>
</div><!-- /col2 -->
$show_cat_box('acenter')
</div>
<div class="col1" <!--#if int($acenter_enable) > 0 then '' else 'style="display:none"'#-->>
<fieldset>
$show_notify_checkboxes('acenter')
@@ -128,8 +149,8 @@
<div class="alert"></div>
</div>
</fieldset>
</div><!-- /col1 -->
</div><!-- /section -->
</div>
</div>
<!--#end if#-->
<!--#if $have_ntfosd#-->
<div class="section">
@@ -141,7 +162,8 @@
<td><label for="ntfosd_enable"> $T('opt-ntfosd_enable')</label></td>
</tr>
</table>
</div><!-- /col2 -->
$show_cat_box('ntfosd')
</div>
<div class="col1" <!--#if int($ntfosd_enable) > 0 then '' else 'style="display:none"'#-->>
<fieldset>
$show_notify_checkboxes('ntfosd')
@@ -153,8 +175,8 @@
<div class="alert"></div>
</div>
</fieldset>
</div><!-- /col1 -->
</div><!-- /section -->
</div>
</div>
<!--#end if#-->
<div class="section" id="growl">
<div class="col2">
@@ -165,7 +187,8 @@
<td><label for="growl_enable"> $T('opt-growl_enable')</label></td>
</tr>
</table>
</div><!-- /col2 -->
$show_cat_box('growl')
</div>
<div class="col1" <!--#if int($growl_enable) > 0 then '' else 'style="display:none"'#-->>
<fieldset>
<div class="field-pair">
@@ -187,8 +210,8 @@
<div class="alert"></div>
</div>
</fieldset>
</div><!-- /col1 -->
</div><!-- /section -->
</div>
</div>
<div class="section" id="prowl">
<div class="col2">
<h3>$T('section-Prowl')</h3>
@@ -199,7 +222,8 @@
</tr>
</table>
<em>$T('explain-prowl_enable')</em>
</div><!-- /col2 -->
$show_cat_box('prowl')
</div>
<div class="col1" <!--#if int($prowl_enable) > 0 then '' else 'style="display:none"'#-->>
<fieldset>
<div class="field-pair">
@@ -231,8 +255,8 @@
<div class="alert"></div>
</div>
</fieldset>
</div><!-- /col1 -->
</div><!-- /section -->
</div>
</div>
<div class="section" id="pushover">
<div class="col2">
@@ -244,7 +268,8 @@
</tr>
</table>
<em>$T('explain-pushover_enable')</em>
</div><!-- /col2 -->
$show_cat_box('pushover')
</div>
<div class="col1" <!--#if int($pushover_enable) > 0 then '' else 'style="display:none"'#-->>
<fieldset>
<div class="field-pair">
@@ -286,8 +311,8 @@
<div class="alert"></div>
</div>
</fieldset>
</div><!-- /col1 -->
</div><!-- /section -->
</div>
</div>
<div class="section" id="pushbullet">
<div class="col2">
<h3>$T('section-Pushbullet')</h3>
@@ -298,7 +323,8 @@
</tr>
</table>
<em>$T('explain-pushbullet_enable')</em>
</div><!-- /col2 -->
$show_cat_box('pushbullet')
</div>
<div class="col1" <!--#if int($pushbullet_enable) > 0 then '' else 'style="display:none"'#-->>
<fieldset>
<div class="field-pair">
@@ -322,19 +348,20 @@
<div class="alert"></div>
</div>
</fieldset>
</div><!-- /col1 -->
</div><!-- /section -->
</div>
</div>
<div class="section" id="nscript">
<div class="col2">
<h3>$T('section-NScript')</h3>
<table>
<tr>
<td><input type="checkbox" name="nscript_enable" id="nscript_enable" value="1" <!--#if int($nscript_enable) > 0 then 'checked="checked"' else ""#--> /></td>
<td><label for="nscript_enable"> $T('opt-nscript_enable')</label></td>
</tr>
</table>
<em>$T('explain-nscript_enable')</em>
</div><!-- /col2 -->
<h3>$T('section-NScript')</h3>
<table>
<tr>
<td><input type="checkbox" name="nscript_enable" id="nscript_enable" value="1" <!--#if int($nscript_enable) > 0 then 'checked="checked"' else ""#--> /></td>
<td><label for="nscript_enable"> $T('opt-nscript_enable')</label></td>
</tr>
</table>
<em>$T('explain-nscript_enable')</em>
$show_cat_box('nscript')
</div>
<div class="col1" <!--#if int($nscript_enable) > 0 then '' else 'style="display:none"'#-->>
<fieldset>
<div class="field-pair">
@@ -360,8 +387,8 @@
<div class="alert"></div>
</div>
</fieldset>
</div><!-- /col1 -->
</div><!-- /section -->
</div>
</div>
</form>
</div><!-- /colmask -->
@@ -374,11 +401,20 @@
\$('.col2 input[name$="enable"]').change(function() {
if(this.checked) {
\$(this).parents('.section').find('.col1').show()
\$(this).parents('.col2').find('.col2-cats').show()
} else {
\$(this).parents('.section').find('.col1').hide()
\$(this).parents('.col2').find('.col2-cats').hide()
}
\$('form').submit()
})
\$('#email_endjob').change(function() {
if(\$(this).val() > 0) {
\$(this).parents('.section').find('.col2-cats').show()
} else {
\$(this).parents('.section').find('.col2-cats').hide()
}
})
/**
Testing functions


@@ -130,6 +130,11 @@
<input type="checkbox" name="auto_sort" id="auto_sort" value="1" <!--#if int($auto_sort) > 0 then 'checked="checked"' else ""#--> />
<span class="desc">$T('explain-auto_sort')</span>
</div>
<div class="field-pair">
<label class="config" for="direct_unpack">$T('opt-direct_unpack')</label>
<input type="checkbox" name="direct_unpack" id="direct_unpack" value="1" <!--#if int($direct_unpack) > 0 then 'checked="checked"' else ""#--> />
<span class="desc">$T('explain-direct_unpack')</span>
</div>
<div class="field-pair">
<button class="btn btn-default saveButton"><span class="glyphicon glyphicon-ok"></span> $T('button-saveChanges')</button>
<button class="btn btn-default restoreDefaults"><span class="glyphicon glyphicon-asterisk"></span> $T('button-restoreDefaults')</button>

View File

@@ -137,7 +137,8 @@ input[type="checkbox"]+.desc {
font-style: italic;
padding: 0 1px;
}
.col2 p {
.col2 p,
.col2-cats {
font-size: 12px;
color: #666;
margin: 1em 0;

View File

@@ -47,7 +47,7 @@
<!-- ko if: historyStatus.has_rating -->
<div class="dropdown history-ratings">
<a href="#" class="name-ratings hover-button" data-toggle="dropdown" onclick="keepOpen(this)">
<a href="#" class="name-icons hover-button" data-toggle="dropdown" onclick="keepOpen(this)">
<span class="glyphicon glyphicon-facetime-video"></span> <span data-bind="text: historyStatus.rating_avg_video"></span>
<span class="glyphicon glyphicon-volume-up"></span> <span data-bind="text: historyStatus.rating_avg_audio"></span>
</a>

View File

@@ -525,10 +525,14 @@
<div class="progress-bar progress-bar-info" data-bind="attr: { 'style': 'width: '+percentage()+'; background-color: ' + \$parent.filelist.currentItem.progressColor() + ';' }">
<input type="checkbox" data-bind="attr: { 'name' : nzf_id }, disable: !canselect(), click : \$parent.filelist.checkSelectRange" title="$T('Glitter-multiSelect')" />
<strong data-bind="text: percentage"></strong>
<span>
<div class="fileDetails">
<span data-bind="truncatedTextCenter: filename"></span>
<div class="fileControls">
<a href="#" data-bind="click: \$parent.filelist.moveButton" class="hover-button buttonMoveToTop" title="$T('Glitter-top')"><span class="glyphicon glyphicon-chevron-up"></span></a>
<a href="#" data-bind="click: \$parent.filelist.moveButton" class="hover-button buttonMoveToBottom" title="$T('Glitter-bottom')"><span class="glyphicon glyphicon-chevron-down"></span></a>
</div>
<small>(<span data-bind="text: file_age"></span> - <span data-bind="text: mb"></span> MB)</small>
</span>
</div>
</div>
</div>
</td>

View File

@@ -95,17 +95,16 @@
<span data-bind="text: password"></span>
</small>
<!-- /ko -->
<!-- ko if: (rating_avg_video() !== false) -->
<div class="name-ratings hover-button">
<span class="glyphicon glyphicon-facetime-video"></span> <span data-bind="text: rating_avg_video"></span>
<span class="glyphicon glyphicon-volume-up"></span> <span data-bind="text: rating_avg_audio"></span>
<div class="name-icons direct-unpack hover-button" data-bind="visible: direct_unpack">
<span class="glyphicon glyphicon-compressed"></span> <span data-bind="text: direct_unpack"></span>
</div>
<!-- /ko -->
</div>
<form data-bind="submit: editingNameSubmit">
<input type="text" data-bind="value: nameForEdit, visible: editingName(), hasfocus: editingName" />
</form>
<div class="name-options" data-bind="visible: !editingName()">
<a href="#" data-bind="click: \$parent.queue.moveButton" class="hover-button buttonMoveToTop" title="$T('Glitter-MoveToTop')"><span class="glyphicon glyphicon-chevron-up"></span></a>
<a href="#" data-bind="click: \$parent.queue.moveButton" class="hover-button buttonMoveToBottom" title="$T('Glitter-MoveToBottom')"><span class="glyphicon glyphicon-chevron-down"></span></a>
<a href="#" data-bind="click: editName, css: { disabled: isGrabbing() }" class="hover-button"><span class="glyphicon glyphicon-pencil"></span></a>
<a href="#" data-bind="click: showFiles, css: { disabled: isGrabbing() }" class="hover-button" title="$T('nzoDetails') - $T('srv-password')"><span class="glyphicon glyphicon-folder-open"></span></a>
<small data-bind="text: avg_age"></small>

View File

@@ -35,6 +35,39 @@ function Fileslisting(parent) {
})
}
// Move to top and bottom buttons
self.moveButton = function (item,event) {
var ITEMKEY = "ko_sortItem",
INDEXKEY = "ko_sourceIndex",
LISTKEY = "ko_sortList",
PARENTKEY = "ko_parentList",
DRAGKEY = "ko_dragItem",
unwrap = ko.utils.unwrapObservable,
dataGet = ko.utils.domData.get,
dataSet = ko.utils.domData.set;
var targetRow,sourceRow,tbody;
sourceRow = $(event.currentTarget).parents("tr").filter(":first");
tbody = sourceRow.parents("tbody").filter(":first");
dataSet(sourceRow[0], INDEXKEY, ko.utils.arrayIndexOf(sourceRow.parent().children(), sourceRow[0]));
sourceRow = sourceRow.detach();
if ($(event.currentTarget).is(".buttonMoveToTop")) {
// we are moving to the top
targetRow = tbody.children(".files-done").filter(":last");
} else {
//we are moving to the bottom
targetRow = tbody.children(".files-sortable").filter(":last");
}
if(targetRow.length < 1 ){
// we found an edge case and need to do something special
targetRow = tbody.children(".files-sortable").filter(":first");
sourceRow.insertBefore(targetRow[0]);
} else {
sourceRow.insertAfter($(targetRow[0]));
}
tbody.sortable('option', 'update').call(tbody[0],null, { item: sourceRow });
};
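The `moveButton` handler above moves a file's `<tr>` to the top or bottom of the sortable list, with the wrinkle that completed files (`.files-done`) stay pinned at the top, so "move to top" really means "insert after the last completed file". A minimal sketch of that placement rule on a plain array (the `done` flag and `moveToTop` name are illustrative assumptions; the real code moves DOM rows and re-fires the sortable's update callback):

```javascript
// Files modeled as plain objects with a `done` flag (an assumption;
// the real handler detaches and re-inserts <tr> elements).
function moveToTop(files, index) {
    var moved = files.splice(index, 1)[0];
    // Completed files stay pinned at the top, so "top" means
    // directly after the last completed file.
    var insertAt = 0;
    for (var i = 0; i < files.length; i++) {
        if (files[i].done) insertAt = i + 1;
    }
    files.splice(insertAt, 0, moved);
    return files;
}
```

With one finished file `a` and pending files `b`, `c`, moving `c` "to top" lands it between `a` and `b`, mirroring the `.files-done` edge case handled in the diff.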
// Trigger update
self.triggerUpdate = function() {
// Call API

View File

@@ -328,16 +328,14 @@ function HistoryModel(parent, data) {
case 'speed':
// Anything to calculate?
if(self.historyStatus.bytes() > 0 && self.historyStatus.download_time() > 0) {
var theSpeed = self.historyStatus.bytes()/self.historyStatus.download_time();
theSpeed = theSpeed/1024;
// MB/s or KB/s
if(theSpeed > 1024) {
theSpeed = theSpeed/1024;
return theSpeed.toFixed(1) + ' MB/s'
} else {
return Math.round(theSpeed) + ' KB/s'
}
try {
// Extract the Download section
var downloadLog = ko.utils.arrayFirst(self.historyStatus.stage_log(), function(item) {
return item.name() == 'Download'
});
// Extract the speed
return downloadLog.actions()[0].match(/(\S*\s\S+)(?=<br\/>)/)[0]
} catch(err) { }
}
return;
case 'category':

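The hunk above stops recomputing speed as bytes/time and instead parses it straight out of the "Download" entry of the history stage log, so the UI shows exactly what the backend reported. A minimal sketch of that lookup without Knockout (the sample log line is an assumption about the stage-log format; the regex is the one from the diff):

```javascript
// Sample stage log; the Download action text is an assumed example.
var stageLog = [
    { name: 'Source', actions: ['example.nzb'] },
    { name: 'Download', actions: ['Downloaded in 5 seconds at an average of 12.3 MB/s<br/>Age: 120d'] }
];

// Plain-array equivalent of ko.utils.arrayFirst: find the Download stage.
var downloadLog = stageLog.filter(function(item) {
    return item.name == 'Download';
})[0];

// Same regex as the diff: the last "value unit" pair right before <br/>.
var speed = downloadLog.actions[0].match(/(\S*\s\S+)(?=<br\/>)/)[0];
```

The lookahead anchors the match to the `<br/>` that ends the first log line, so `speed` comes out as `12.3 MB/s` for the sample above; the surrounding `try/catch` in the diff swallows the case where the Download entry or the pattern is missing.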
View File

@@ -148,7 +148,6 @@ function QueueListModel(parent) {
// See what the actual index is of the queue-object
// This way we can see how we move up and down independent of pagination
var itemReplaced = self.queueItems()[event.targetIndex+corTerm];
callAPI({
mode: "switch",
value: itemMoved.id,
@@ -156,6 +155,25 @@ function QueueListModel(parent) {
}).then(self.parent.refresh);
};
// Move button clicked
self.moveButton = function(event,ui) {
var itemMoved = event;
var targetIndex;
if($(ui.currentTarget).is(".buttonMoveToTop")){
//we want to move to the top
targetIndex = 0;
} else {
// we want to move to the bottom
targetIndex = self.totalItems() - 1;
}
callAPI({
mode: "switch",
value: itemMoved.id,
value2: targetIndex
}).then(self.parent.refresh);
}
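The queue-level `moveButton` above delegates the actual move to the existing `switch` API mode: target index 0 for top, `totalItems() - 1` for bottom. A sketch of the request object it hands to `callAPI` (the helper name and job id are illustrative assumptions):

```javascript
// Build the "switch" API request the handler above issues.
function buildSwitchRequest(jobId, toTop, totalItems) {
    return {
        mode: 'switch',
        value: jobId,
        value2: toTop ? 0 : totalItems - 1  // 0 = top of queue, last index = bottom
    };
}
```

Reusing `switch` means no new API mode was needed for the move-to-top/bottom buttons; the server treats them like any other drag reorder.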
// Save pagination state
self.paginationLimit.subscribe(function(newValue) {
// Save in config if global
@@ -465,6 +483,7 @@ function QueueModel(parent, data) {
self.remainingMB = ko.observable(parseFloat(data.mbleft));
self.avg_age = ko.observable(data.avg_age)
self.missing = ko.observable(parseFloat(data.mbmissing))
self.direct_unpack = ko.observable(data.direct_unpack)
self.category = ko.observable(data.cat);
self.priority = ko.observable(parent.priorityName[data.priority]);
self.script = ko.observable(data.script);
@@ -476,8 +495,6 @@ function QueueModel(parent, data) {
self.nameForEdit = ko.observable();
self.editingName = ko.observable(false);
self.hasDropdown = ko.observable(false);
self.rating_avg_video = ko.observable(false)
self.rating_avg_audio = ko.observable(false)
// Color of the progress bar
self.progressColor = ko.computed(function() {
@@ -566,18 +583,13 @@ function QueueModel(parent, data) {
self.remainingMB(parseFloat(data.mbleft));
self.avg_age(data.avg_age)
self.missing(parseFloat(data.mbmissing))
self.direct_unpack(data.direct_unpack)
self.category(data.cat);
self.priority(parent.priorityName[data.priority]);
self.script(data.script);
self.unpackopts(parseInt(data.unpackopts)) // UnpackOpts fails if not parseInt'd!
self.pausedStatus(data.status == 'Paused');
self.timeLeft(data.timeleft);
// If exists, otherwise false
if(data.rating_avg_video !== undefined) {
self.rating_avg_video(data.rating_avg_video === 0 ? '-' : data.rating_avg_video);
self.rating_avg_audio(data.rating_avg_audio === 0 ? '-' : data.rating_avg_audio);
}
};
// Pause individual download

View File

@@ -50,7 +50,8 @@ legend,
color: white !important;
}
.hover-button {
.hover-button,
.fileControls a:hover {
opacity: 0.7;
}
@@ -81,7 +82,8 @@ legend,
.max-speed-input-clear,
.max-speed-input-clear:hover,
.nav-tabs>li>a:hover {
.nav-tabs>li>a:hover,
.fileControls a {
color: black;
}
@@ -175,7 +177,7 @@ tbody .caret {
color: #D6D6D6;
}
td.name .name-ratings span,
td.name .name-icons span,
.navbar-nav .open .dropdown-menu>li>a,
.dropdown-header,
#modal-help small,

View File

@@ -617,6 +617,7 @@ td.name .row-wrap-text {
}
.queue-table td.name .name-options small,
.queue-table td.name .direct-unpack,
.queue-item-password {
opacity: 0.5;
}
@@ -626,7 +627,7 @@ td.name .row-wrap-text {
}
.queue-table td.name:hover .row-wrap-text {
max-width: calc(100% - 85px);
max-width: calc(100% - 125px);
/* Change for each size! */
}
@@ -648,18 +649,18 @@ td.name .row-wrap-text {
border: 1px solid #ccc;
}
td.name .name-ratings {
td.name .name-icons {
display: inline;
margin-left: 5px;
color: black !important;
text-decoration: none !important;
}
.queue-table td.name:hover .name-ratings {
.queue-table td.name:hover .name-icons {
display: none;
}
td.name .name-ratings .glyphicon {
td.name .name-icons .glyphicon {
margin-left: 2px;
}
@@ -769,6 +770,35 @@ tr.queue-item>td:first-child>a {
padding-right: 10px;
}
.item-files-table tr .fileControls{
float:right;
display:none;
}
.item-files-table tr.files-sortable:hover .fileControls{
float:right;
display:block;
margin-left:5px;
}
.progress .progress-bar .fileDetails {
display:inline;
text-align: left;
margin-left: 70px;
line-height: 25px;
position: absolute;
top: 0;
left: 0;
z-index: 2;
font-size: 12px;
color: #404040;
padding-right: 0px;
}
.progress .progress-bar .fileDetails>span {
float: left;
}
.progress strong {
font-size: 13px;
}
@@ -1035,7 +1065,7 @@ tr.queue-item>td:first-child>a {
opacity: 1;
}
.history-ratings .name-ratings {
.history-ratings .name-icons {
float: none !important;
}
@@ -1623,6 +1653,11 @@ input[name="nzbURL"] {
#modal-item-files .item-files-table .progress small {
color: #727272 !important;
margin-left: 5px;
}
#modal-item-files .item-files-table tr.files-sortable:hover .progress small {
display:none;
}
#modal-item-files .item-files-table td {
@@ -1810,7 +1845,7 @@ input[name="nzbURL"] {
}
@media screen and (max-width: 1200px) {
td.name .name-ratings {
td.name .name-icons {
margin-left: 0px;
margin-right: -5px;
display: block;
@@ -1857,6 +1892,11 @@ input[name="nzbURL"] {
.queue .sortable-placeholder td {
padding: 9px 0px 8px !important;
}
.queue-table .buttonMoveToBottom,
.queue-table .buttonMoveToTop {
display: inline;
}
}
@media screen and (min-height: 800px) {

View File

@@ -132,6 +132,11 @@ h2 {
max-width: calc(100% - 45px);
}
.queue-table .buttonMoveToBottom,
.queue-table .buttonMoveToTop {
display: none;
}
tr.queue-item>td:first-child>a {
margin-top: 3px;
}

View File

Binary file not shown.

View File

@@ -12,7 +12,7 @@ msgstr ""
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=ASCII\n"
"Content-Transfer-Encoding: 7bit\n"
"POT-Creation-Date: 2017-06-26 10:38+W. Europe Daylight Time\n"
"POT-Creation-Date: 2017-07-17 20:10+W. Europe Daylight Time\n"
"Generated-By: pygettext.py 1.5\n"
@@ -402,11 +402,29 @@ msgid "Unknown Error while decoding %s"
msgstr ""
#: sabnzbd/decoder.py
msgid "%s => missing from all servers, discarding"
msgid "UUencode detected, only yEnc encoding is supported [%s]"
msgstr ""
#: sabnzbd/decoder.py
msgid "UUencode detected, only yEnc encoding is supported [%s]"
msgid "%s => missing from all servers, discarding"
msgstr ""
#: sabnzbd/directunpacker.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py
msgid "Unpacking"
msgstr ""
#: sabnzbd/directunpacker.py # sabnzbd/newsunpack.py
msgid "Unpacked %s files/folders in %s"
msgstr ""
#: sabnzbd/directunpacker.py [Warning message]
msgid "Direct Unpack was automatically enabled."
msgstr ""
#: sabnzbd/directunpacker.py [Warning message] # sabnzbd/skintext.py
msgid "Jobs will start unpacking during the downloading to reduce post-processing time. Only works for jobs that do not need repair."
msgstr ""
#: sabnzbd/dirscanner.py [Error message] # sabnzbd/dirscanner.py [Error message]
@@ -745,8 +763,6 @@ msgstr ""
msgid "Error \"%s\" while running rar_unpack on %s"
msgstr ""
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
@@ -763,11 +779,6 @@ msgstr ""
msgid "Unpacking failed, archive requires a password"
msgstr ""
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
msgid "Unpacking"
msgstr ""
#: sabnzbd/newsunpack.py # sabnzbd/skintext.py [PP phase "unpack"]
msgid "Unpack"
msgstr ""
@@ -829,10 +840,6 @@ msgstr ""
msgid "Corrupt RAR file"
msgstr ""
#: sabnzbd/newsunpack.py
msgid "Unpacked %s files/folders in %s"
msgstr ""
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
msgid "%s files in %s"
msgstr ""
@@ -1384,10 +1391,6 @@ msgstr ""
msgid "Download failed - Not on your server(s)"
msgstr ""
#: sabnzbd/postproc.py
msgid "Cannot create final folder %s"
msgstr ""
#: sabnzbd/postproc.py
msgid "No post-processing because of failed verification"
msgstr ""
@@ -1452,6 +1455,10 @@ msgstr ""
msgid "Download Completed"
msgstr ""
#: sabnzbd/postproc.py [Error message]
msgid "Cannot create final folder %s"
msgstr ""
#: sabnzbd/postproc.py
msgid "Post-processing"
msgstr ""
@@ -2960,6 +2967,10 @@ msgstr ""
msgid "Automatically sort items by (average) age."
msgstr ""
#: sabnzbd/skintext.py
msgid "Direct Unpack"
msgstr ""
#: sabnzbd/skintext.py
msgid "Posts will be paused untill they are at least this age. Setting job priority to Force will skip the delay."
msgstr ""
@@ -3856,7 +3867,7 @@ msgstr ""
msgid "Delete"
msgstr ""
#: sabnzbd/skintext.py [Job details page, move file to top]
#: sabnzbd/skintext.py [Job details page, move file to top] # sabnzbd/skintext.py
msgid "Top"
msgstr ""
@@ -3868,7 +3879,7 @@ msgstr ""
msgid "Down"
msgstr ""
#: sabnzbd/skintext.py [Job details page, move file to bottom]
#: sabnzbd/skintext.py [Job details page, move file to bottom] # sabnzbd/skintext.py
msgid "Bottom"
msgstr ""

View File

@@ -7,15 +7,15 @@ msgid ""
msgstr ""
"Project-Id-Version: sabnzbd\n"
"Report-Msgid-Bugs-To: FULL NAME <EMAIL@ADDRESS>\n"
"POT-Creation-Date: 2017-06-26 23:00+0000\n"
"POT-Creation-Date: 2017-07-17 18:42+0000\n"
"PO-Revision-Date: 2017-06-22 07:07+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>\n"
"Language-Team: Danish <da@li.org>\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
"X-Launchpad-Export-Date: 2017-06-27 06:01+0000\n"
"X-Generator: Launchpad (build 18416)\n"
"X-Launchpad-Export-Date: 2017-07-18 05:26+0000\n"
"X-Generator: Launchpad (build 18419)\n"
#: SABnzbd.py [Error message]
msgid "Failed to start web-interface"
@@ -427,13 +427,33 @@ msgstr "Forkert udformet yEnc artikel i %s"
msgid "Unknown Error while decoding %s"
msgstr "Ukendt fejl under afkodning af %s"
#: sabnzbd/decoder.py
msgid "UUencode detected, only yEnc encoding is supported [%s]"
msgstr "UUENCODE detekteret, kun yEnc kodning understøttes [%s]"
#: sabnzbd/decoder.py
msgid "%s => missing from all servers, discarding"
msgstr "%s => mangler fra alle servere, afviser"
#: sabnzbd/decoder.py
msgid "UUencode detected, only yEnc encoding is supported [%s]"
msgstr "UUENCODE detekteret, kun yEnc kodning understøttes [%s]"
#: sabnzbd/directunpacker.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py
msgid "Unpacking"
msgstr "Udpakker"
#: sabnzbd/directunpacker.py # sabnzbd/newsunpack.py
msgid "Unpacked %s files/folders in %s"
msgstr "Udpakket %s filer/mapper i %s"
#: sabnzbd/directunpacker.py [Warning message]
msgid "Direct Unpack was automatically enabled."
msgstr ""
#: sabnzbd/directunpacker.py [Warning message] # sabnzbd/skintext.py
msgid ""
"Jobs will start unpacking during the downloading to reduce post-processing "
"time. Only works for jobs that do not need repair."
msgstr ""
#: sabnzbd/dirscanner.py [Error message] # sabnzbd/dirscanner.py [Error message]
msgid "Error removing %s"
@@ -806,8 +826,6 @@ msgstr "[%s] Fejl \"%s\" under udpakning af RAR fil(er)"
msgid "Error \"%s\" while running rar_unpack on %s"
msgstr "Fejl \"%s\" når du køre rar_unpack på %s"
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
@@ -824,11 +842,6 @@ msgstr "Forsøger unrar med adgangskode \"%s\""
msgid "Unpacking failed, archive requires a password"
msgstr "Udpakning mislykkedes, arkivet kræver password"
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
msgid "Unpacking"
msgstr "Udpakker"
#: sabnzbd/newsunpack.py # sabnzbd/skintext.py [PP phase "unpack"]
msgid "Unpack"
msgstr "Udpak"
@@ -890,10 +903,6 @@ msgstr "Ubrugelig RAR fil"
msgid "Corrupt RAR file"
msgstr "Ødelagt RAR fil"
#: sabnzbd/newsunpack.py
msgid "Unpacked %s files/folders in %s"
msgstr "Udpakket %s filer/mapper i %s"
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
msgid "%s files in %s"
msgstr "%s filer i %s"
@@ -1486,10 +1495,6 @@ msgstr "Overførslen kan mislykkes, kun %s af det krævede %s tilgængelig"
msgid "Download failed - Not on your server(s)"
msgstr "Download mislykkedes - ikke på din server (e)"
#: sabnzbd/postproc.py
msgid "Cannot create final folder %s"
msgstr "Kan ikke oprette endelig mappe %s"
#: sabnzbd/postproc.py
msgid "No post-processing because of failed verification"
msgstr "Ingen efterbehandling på grund af mislykket godkendelse"
@@ -1554,6 +1559,10 @@ msgstr "Det lykkedes ikke at fjerne arbejdsmappen (%s)"
msgid "Download Completed"
msgstr "Overførsel fuldført"
#: sabnzbd/postproc.py [Error message]
msgid "Cannot create final folder %s"
msgstr "Kan ikke oprette endelig mappe %s"
#: sabnzbd/postproc.py
msgid "Post-processing"
msgstr "Efterbehandling"
@@ -3179,6 +3188,10 @@ msgstr "Sortere efter alder"
msgid "Automatically sort items by (average) age."
msgstr "Sortere automatisk efter (gennemsnits) alder."
#: sabnzbd/skintext.py
msgid "Direct Unpack"
msgstr ""
#: sabnzbd/skintext.py
msgid ""
"Posts will be paused untill they are at least this age. Setting job priority "
@@ -4137,7 +4150,7 @@ msgstr "Ændre NZB detaljer"
msgid "Delete"
msgstr "Slet"
#: sabnzbd/skintext.py [Job details page, move file to top]
#: sabnzbd/skintext.py [Job details page, move file to top] # sabnzbd/skintext.py
msgid "Top"
msgstr "Øverst"
@@ -4149,7 +4162,7 @@ msgstr "Op"
msgid "Down"
msgstr "Ned"
#: sabnzbd/skintext.py [Job details page, move file to bottom]
#: sabnzbd/skintext.py [Job details page, move file to bottom] # sabnzbd/skintext.py
msgid "Bottom"
msgstr "Bunden"

View File

@@ -7,15 +7,15 @@ msgid ""
msgstr ""
"Project-Id-Version: sabnzbd\n"
"Report-Msgid-Bugs-To: FULL NAME <EMAIL@ADDRESS>\n"
"POT-Creation-Date: 2017-06-26 23:00+0000\n"
"POT-Creation-Date: 2017-07-17 18:42+0000\n"
"PO-Revision-Date: 2017-06-22 07:06+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>\n"
"Language-Team: German <de@li.org>\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
"X-Launchpad-Export-Date: 2017-06-27 06:01+0000\n"
"X-Generator: Launchpad (build 18416)\n"
"X-Launchpad-Export-Date: 2017-07-18 05:26+0000\n"
"X-Generator: Launchpad (build 18419)\n"
#: SABnzbd.py [Error message]
msgid "Failed to start web-interface"
@@ -446,13 +446,33 @@ msgstr "Ungültiger yEnc-Artikel in %s"
msgid "Unknown Error while decoding %s"
msgstr "Unbekannter Fehler %s beim Dekodieren"
#: sabnzbd/decoder.py
msgid "UUencode detected, only yEnc encoding is supported [%s]"
msgstr "UUencode gefunden, nur yEnc Codierung wir unterstützt [%s]"
#: sabnzbd/decoder.py
msgid "%s => missing from all servers, discarding"
msgstr "%s wurde auf keinem Server gefunden und daher übersprungen"
#: sabnzbd/decoder.py
msgid "UUencode detected, only yEnc encoding is supported [%s]"
msgstr "UUencode gefunden, nur yEnc Codierung wir unterstützt [%s]"
#: sabnzbd/directunpacker.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py
msgid "Unpacking"
msgstr "Entpacken"
#: sabnzbd/directunpacker.py # sabnzbd/newsunpack.py
msgid "Unpacked %s files/folders in %s"
msgstr "%s Datei(en)/Ordner entpackt in %s"
#: sabnzbd/directunpacker.py [Warning message]
msgid "Direct Unpack was automatically enabled."
msgstr ""
#: sabnzbd/directunpacker.py [Warning message] # sabnzbd/skintext.py
msgid ""
"Jobs will start unpacking during the downloading to reduce post-processing "
"time. Only works for jobs that do not need repair."
msgstr ""
#: sabnzbd/dirscanner.py [Error message] # sabnzbd/dirscanner.py [Error message]
msgid "Error removing %s"
@@ -827,8 +847,6 @@ msgstr "[%s] Fehler \"%s\" beim Entpacken der RAR-Dateien"
msgid "Error \"%s\" while running rar_unpack on %s"
msgstr "Fehler \"%s\" beim Ausführen von rar_unpack auf %s"
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
@@ -845,11 +863,6 @@ msgstr "Versuche entpacken mit Passwort \"%s\""
msgid "Unpacking failed, archive requires a password"
msgstr "Entpacken fehlgeschlagen. Archiv benötigt ein Passwort."
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
msgid "Unpacking"
msgstr "Entpacken"
#: sabnzbd/newsunpack.py # sabnzbd/skintext.py [PP phase "unpack"]
msgid "Unpack"
msgstr "Entpacken"
@@ -913,10 +926,6 @@ msgstr "RAR-Datei beschädigt"
msgid "Corrupt RAR file"
msgstr "Defekte RAR Datei"
#: sabnzbd/newsunpack.py
msgid "Unpacked %s files/folders in %s"
msgstr "%s Datei(en)/Ordner entpackt in %s"
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
msgid "%s files in %s"
msgstr "%s Dateien in %s"
@@ -1531,10 +1540,6 @@ msgstr ""
msgid "Download failed - Not on your server(s)"
msgstr "Download fehlgeschlagen - Nicht auf deinem/n Server/n vorhanden"
#: sabnzbd/postproc.py
msgid "Cannot create final folder %s"
msgstr "Konnte Download-Ordner %s nicht anlegen"
#: sabnzbd/postproc.py
msgid "No post-processing because of failed verification"
msgstr "Keine Nachbearbeitung wegen fehlgeschlagener Überprüfung"
@@ -1599,6 +1604,10 @@ msgstr "Fehler beim Entfernen des Arbeitsverzeichnisses %s."
msgid "Download Completed"
msgstr "Download fertig"
#: sabnzbd/postproc.py [Error message]
msgid "Cannot create final folder %s"
msgstr "Konnte Download-Ordner %s nicht anlegen"
#: sabnzbd/postproc.py
msgid "Post-processing"
msgstr "Nachbearbeitung"
@@ -3248,6 +3257,10 @@ msgid "Automatically sort items by (average) age."
msgstr ""
"Einträge automatisch nach ihrem (durchschnittlichen) Alter sortieren."
#: sabnzbd/skintext.py
msgid "Direct Unpack"
msgstr ""
#: sabnzbd/skintext.py
msgid ""
"Posts will be paused untill they are at least this age. Setting job priority "
@@ -4213,7 +4226,7 @@ msgstr "NZB-Details bearbeiten"
msgid "Delete"
msgstr "Löschen"
#: sabnzbd/skintext.py [Job details page, move file to top]
#: sabnzbd/skintext.py [Job details page, move file to top] # sabnzbd/skintext.py
msgid "Top"
msgstr "Ganz nach oben"
@@ -4225,7 +4238,7 @@ msgstr "Nach oben"
msgid "Down"
msgstr "Nach unten"
#: sabnzbd/skintext.py [Job details page, move file to bottom]
#: sabnzbd/skintext.py [Job details page, move file to bottom] # sabnzbd/skintext.py
msgid "Bottom"
msgstr "Ganz nach unten"

View File

@@ -7,15 +7,15 @@ msgid ""
msgstr ""
"Project-Id-Version: sabnzbd\n"
"Report-Msgid-Bugs-To: FULL NAME <EMAIL@ADDRESS>\n"
"POT-Creation-Date: 2017-06-26 23:00+0000\n"
"POT-Creation-Date: 2017-07-17 18:42+0000\n"
"PO-Revision-Date: 2017-06-22 07:07+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>\n"
"Language-Team: Spanish <es@li.org>\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
"X-Launchpad-Export-Date: 2017-06-27 06:01+0000\n"
"X-Generator: Launchpad (build 18416)\n"
"X-Launchpad-Export-Date: 2017-07-18 05:27+0000\n"
"X-Generator: Launchpad (build 18419)\n"
#: SABnzbd.py [Error message]
msgid "Failed to start web-interface"
@@ -428,13 +428,33 @@ msgstr "Articulo yEnc corrupto en %s"
msgid "Unknown Error while decoding %s"
msgstr "Error inespecifico mientras descodificando %s"
#: sabnzbd/decoder.py
msgid "UUencode detected, only yEnc encoding is supported [%s]"
msgstr "UUencode detectada, la única codificación válida es Enc [%s]"
#: sabnzbd/decoder.py
msgid "%s => missing from all servers, discarding"
msgstr "%s => faltando de todos servidores, desechando"
#: sabnzbd/decoder.py
msgid "UUencode detected, only yEnc encoding is supported [%s]"
msgstr "UUencode detectada, la única codificación válida es Enc [%s]"
#: sabnzbd/directunpacker.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py
msgid "Unpacking"
msgstr "Descomprimiendo"
#: sabnzbd/directunpacker.py # sabnzbd/newsunpack.py
msgid "Unpacked %s files/folders in %s"
msgstr "Descompresos %s archivos/directorios en %s"
#: sabnzbd/directunpacker.py [Warning message]
msgid "Direct Unpack was automatically enabled."
msgstr ""
#: sabnzbd/directunpacker.py [Warning message] # sabnzbd/skintext.py
msgid ""
"Jobs will start unpacking during the downloading to reduce post-processing "
"time. Only works for jobs that do not need repair."
msgstr ""
#: sabnzbd/dirscanner.py [Error message] # sabnzbd/dirscanner.py [Error message]
msgid "Error removing %s"
@@ -809,8 +829,6 @@ msgstr "[%s] Error \"%s\" al descomprimir ficheros RAR"
msgid "Error \"%s\" while running rar_unpack on %s"
msgstr "Error \"%s\" al ejecutar rar_unpack sobre %s"
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
@@ -827,11 +845,6 @@ msgstr "Intentado descomprimir rar con contraseña \"%s\""
msgid "Unpacking failed, archive requires a password"
msgstr "Error al descomprimir; El archivo está protegido por contraseña"
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
msgid "Unpacking"
msgstr "Descomprimiendo"
#: sabnzbd/newsunpack.py # sabnzbd/skintext.py [PP phase "unpack"]
msgid "Unpack"
msgstr "Descomprimir"
@@ -894,10 +907,6 @@ msgstr "Archivo RAR inutilizable"
msgid "Corrupt RAR file"
msgstr "Fichero RAR corrupto"
#: sabnzbd/newsunpack.py
msgid "Unpacked %s files/folders in %s"
msgstr "Descompresos %s archivos/directorios en %s"
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
msgid "%s files in %s"
msgstr "%s archivos en %s"
@@ -1503,10 +1512,6 @@ msgstr "La descarga fallo, solo %s de los %s requeridos estan disponibles"
msgid "Download failed - Not on your server(s)"
msgstr "Descarga fallida - No está en tu(s) servidor(es)"
#: sabnzbd/postproc.py
msgid "Cannot create final folder %s"
msgstr "Imposible crear directorio final %s"
#: sabnzbd/postproc.py
msgid "No post-processing because of failed verification"
msgstr "No se ha podido post-procesar debido a un fallo en la verificación"
@@ -1571,6 +1576,10 @@ msgstr "Error al eliminar el directorio de trabajo (%s)"
msgid "Download Completed"
msgstr "Descarga Completada"
#: sabnzbd/postproc.py [Error message]
msgid "Cannot create final folder %s"
msgstr "Imposible crear directorio final %s"
#: sabnzbd/postproc.py
msgid "Post-processing"
msgstr "Post-Procesado"
@@ -3198,6 +3207,10 @@ msgstr "Ordenar por antigüedad"
msgid "Automatically sort items by (average) age."
msgstr "Automáticamente ordenar elementos por antigüedad (promedio)."
#: sabnzbd/skintext.py
msgid "Direct Unpack"
msgstr ""
#: sabnzbd/skintext.py
msgid ""
"Posts will be paused untill they are at least this age. Setting job priority "
@@ -4157,7 +4170,7 @@ msgstr "Editar Detalles de NZB"
msgid "Delete"
msgstr "Eliminar"
#: sabnzbd/skintext.py [Job details page, move file to top]
#: sabnzbd/skintext.py [Job details page, move file to top] # sabnzbd/skintext.py
msgid "Top"
msgstr "Superior"
@@ -4169,7 +4182,7 @@ msgstr "Encima"
msgid "Down"
msgstr "Abajo"
#: sabnzbd/skintext.py [Job details page, move file to bottom]
#: sabnzbd/skintext.py [Job details page, move file to bottom] # sabnzbd/skintext.py
msgid "Bottom"
msgstr "Último"

View File

@@ -7,15 +7,15 @@ msgid ""
msgstr ""
"Project-Id-Version: sabnzbd\n"
"Report-Msgid-Bugs-To: FULL NAME <EMAIL@ADDRESS>\n"
"POT-Creation-Date: 2017-06-26 23:00+0000\n"
"POT-Creation-Date: 2017-07-17 18:42+0000\n"
"PO-Revision-Date: 2017-06-22 07:07+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>\n"
"Language-Team: Finnish <fi@li.org>\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
"X-Launchpad-Export-Date: 2017-06-27 06:01+0000\n"
"X-Generator: Launchpad (build 18416)\n"
"X-Launchpad-Export-Date: 2017-07-18 05:26+0000\n"
"X-Generator: Launchpad (build 18419)\n"
#: SABnzbd.py [Error message]
msgid "Failed to start web-interface"
@@ -427,13 +427,33 @@ msgstr "Huonosti muotoiltu yEnc artikkeli %s"
msgid "Unknown Error while decoding %s"
msgstr "Tuntematon virhe dekoodattaessa %s"
#: sabnzbd/decoder.py
msgid "UUencode detected, only yEnc encoding is supported [%s]"
msgstr "UUencode-koodaus havaittiin, vain yEnc-koodausta tuetaan [%s]"
#: sabnzbd/decoder.py
msgid "%s => missing from all servers, discarding"
msgstr "%s => puuttuu kaikilta palvelimilta, hylätään"
#: sabnzbd/decoder.py
msgid "UUencode detected, only yEnc encoding is supported [%s]"
msgstr "UUencode-koodaus havaittiin, vain yEnc-koodausta tuetaan [%s]"
#: sabnzbd/directunpacker.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py
msgid "Unpacking"
msgstr "Puretaan"
#: sabnzbd/directunpacker.py # sabnzbd/newsunpack.py
msgid "Unpacked %s files/folders in %s"
msgstr "Purettiin %s tiedostoa/kansiota kohteeseen %s"
#: sabnzbd/directunpacker.py [Warning message]
msgid "Direct Unpack was automatically enabled."
msgstr ""
#: sabnzbd/directunpacker.py [Warning message] # sabnzbd/skintext.py
msgid ""
"Jobs will start unpacking during the downloading to reduce post-processing "
"time. Only works for jobs that do not need repair."
msgstr ""
#: sabnzbd/dirscanner.py [Error message] # sabnzbd/dirscanner.py [Error message]
msgid "Error removing %s"
@@ -804,8 +824,6 @@ msgstr "[%s] Virhe \"%s\" purettaessa RAR tiedostoja"
msgid "Error \"%s\" while running rar_unpack on %s"
msgstr "Virhe \"%s\" ajettaessa rar_unpack kohteelle %s"
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
@@ -822,11 +840,6 @@ msgstr "Yritetään purkaa rar arkistoa salasanalla \"%s\""
msgid "Unpacking failed, archive requires a password"
msgstr "Purkaminen epäonnistui, arkisto vaatii salasanan"
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
msgid "Unpacking"
msgstr "Puretaan"
#: sabnzbd/newsunpack.py # sabnzbd/skintext.py [PP phase "unpack"]
msgid "Unpack"
msgstr "Pura"
@@ -888,10 +901,6 @@ msgstr "Käyttökelvoton RAR arkisto"
msgid "Corrupt RAR file"
msgstr "Korruptoitunut RAR arkisto"
#: sabnzbd/newsunpack.py
msgid "Unpacked %s files/folders in %s"
msgstr "Purettiin %s tiedostoa/kansiota kohteeseen %s"
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
msgid "%s files in %s"
msgstr "%s tiedostoa kohteessa %s"
@@ -1490,10 +1499,6 @@ msgstr "Lataaminen saattaa epäonnistua, vain %s osaa %s osasta saatavilla"
msgid "Download failed - Not on your server(s)"
msgstr "Lataus epäonnistui - Ei ole palvelimilla"
#: sabnzbd/postproc.py
msgid "Cannot create final folder %s"
msgstr "Ei voitu luoda lopullista kansiota %s"
#: sabnzbd/postproc.py
msgid "No post-processing because of failed verification"
msgstr "Jälkikäsittelyä ei suoritettu, koska varmennus epäonnistui"
@@ -1558,6 +1563,10 @@ msgstr "Virhe poistettaessa työkansiota (%s)"
msgid "Download Completed"
msgstr "Lataus valmistui"
#: sabnzbd/postproc.py [Error message]
msgid "Cannot create final folder %s"
msgstr "Ei voitu luoda lopullista kansiota %s"
#: sabnzbd/postproc.py
msgid "Post-processing"
msgstr "Jälkikäsittely"
@@ -3192,6 +3201,10 @@ msgstr "Järjestä iän mukaan"
msgid "Automatically sort items by (average) age."
msgstr "Järjestelee kohteet (keskimääräisen) iän mukaan."
#: sabnzbd/skintext.py
msgid "Direct Unpack"
msgstr ""
#: sabnzbd/skintext.py
msgid ""
"Posts will be paused untill they are at least this age. Setting job priority "
@@ -4148,7 +4161,7 @@ msgstr "NZB tietojen muokkaus"
msgid "Delete"
msgstr "Poista"
#: sabnzbd/skintext.py [Job details page, move file to top]
#: sabnzbd/skintext.py [Job details page, move file to top] # sabnzbd/skintext.py
msgid "Top"
msgstr "Ylin"
@@ -4160,7 +4173,7 @@ msgstr "Ylös"
msgid "Down"
msgstr "Alas"
#: sabnzbd/skintext.py [Job details page, move file to bottom]
#: sabnzbd/skintext.py [Job details page, move file to bottom] # sabnzbd/skintext.py
msgid "Bottom"
msgstr "Alin"

View File

@@ -7,15 +7,15 @@ msgid ""
msgstr ""
"Project-Id-Version: sabnzbd\n"
"Report-Msgid-Bugs-To: FULL NAME <EMAIL@ADDRESS>\n"
"POT-Creation-Date: 2017-06-26 23:00+0000\n"
"PO-Revision-Date: 2017-06-22 07:05+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>\n"
"POT-Creation-Date: 2017-07-17 18:42+0000\n"
"PO-Revision-Date: 2017-07-17 19:05+0000\n"
"Last-Translator: Fred <88com88@gmail.com>\n"
"Language-Team: French <fr@li.org>\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
"X-Launchpad-Export-Date: 2017-06-27 06:01+0000\n"
"X-Generator: Launchpad (build 18416)\n"
"X-Launchpad-Export-Date: 2017-07-18 05:26+0000\n"
"X-Generator: Launchpad (build 18419)\n"
#: SABnzbd.py [Error message]
msgid "Failed to start web-interface"
@@ -448,13 +448,36 @@ msgstr "Article yEnc mal construit dans %s"
msgid "Unknown Error while decoding %s"
msgstr "Erreur inconnue lors du décodage de %s"
#: sabnzbd/decoder.py
msgid "UUencode detected, only yEnc encoding is supported [%s]"
msgstr "UUencode détecté, seul l'encodage yEnc est compatible [%s]"
#: sabnzbd/decoder.py
msgid "%s => missing from all servers, discarding"
msgstr "%s => absent de tous les serveurs, rejeté"
#: sabnzbd/decoder.py
msgid "UUencode detected, only yEnc encoding is supported [%s]"
msgstr "UUencode détecté, seul l'encodage yEnc est compatible [%s]"
#: sabnzbd/directunpacker.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py
msgid "Unpacking"
msgstr "Extraction"
#: sabnzbd/directunpacker.py # sabnzbd/newsunpack.py
msgid "Unpacked %s files/folders in %s"
msgstr "%s fichier(s)/dossier(s) extrait(s) en %s"
#: sabnzbd/directunpacker.py [Warning message]
msgid "Direct Unpack was automatically enabled."
msgstr "La Décompression Directe a été activée automatiquement."
#: sabnzbd/directunpacker.py [Warning message] # sabnzbd/skintext.py
msgid ""
"Jobs will start unpacking during the downloading to reduce post-processing "
"time. Only works for jobs that do not need repair."
msgstr ""
"Les tâches seront décompressées pendant le téléchargement pour réduire le "
"temps de post-traitement. Fonctionne uniquement pour les tâches qui ne "
"nécessitent aucune réparation."
#: sabnzbd/dirscanner.py [Error message] # sabnzbd/dirscanner.py [Error message]
msgid "Error removing %s"
@@ -831,8 +854,6 @@ msgstr "[%s] Erreur \"%s\" lors de l'extraction des fichiers RAR"
msgid "Error \"%s\" while running rar_unpack on %s"
msgstr "Erreur \"%s\" lors de l'exécution de rar_unpack sur %s"
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
@@ -849,11 +870,6 @@ msgstr "Tentative d'extraction avec le mot de passe \"%s\""
msgid "Unpacking failed, archive requires a password"
msgstr "Échec de l'extraction, l'archive nécessite un mot de passe"
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
msgid "Unpacking"
msgstr "Extraction"
#: sabnzbd/newsunpack.py # sabnzbd/skintext.py [PP phase "unpack"]
msgid "Unpack"
msgstr "Décompresser"
@@ -919,10 +935,6 @@ msgstr "Fichier RAR inutilisable"
msgid "Corrupt RAR file"
msgstr "Fichier RAR corrompu"
#: sabnzbd/newsunpack.py
msgid "Unpacked %s files/folders in %s"
msgstr "%s fichier(s)/dossier(s) extrait(s) en %s"
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
msgid "%s files in %s"
msgstr "%s fichiers dans %s"
@@ -1045,7 +1057,7 @@ msgstr "Vérification"
#: sabnzbd/newsunpack.py
msgid "Verifying repair"
msgstr ""
msgstr "Vérification de la réparation"
#: sabnzbd/newsunpack.py [Error message]
msgid "Python script \"%s\" does not have execute (+x) permission set"
@@ -1541,10 +1553,6 @@ msgstr ""
msgid "Download failed - Not on your server(s)"
msgstr "Le téléchargement a échoué - absent de vos serveur(s)"
#: sabnzbd/postproc.py
msgid "Cannot create final folder %s"
msgstr "Impossible de créer le dossier final %s"
#: sabnzbd/postproc.py
msgid "No post-processing because of failed verification"
msgstr "Pas de post-traitement car la vérification a échoué"
@@ -1609,6 +1617,10 @@ msgstr "Erreur lors de la suppression du dossier de travail (%s)"
msgid "Download Completed"
msgstr "Téléchargement terminé"
#: sabnzbd/postproc.py [Error message]
msgid "Cannot create final folder %s"
msgstr "Impossible de créer le dossier final %s"
#: sabnzbd/postproc.py
msgid "Post-processing"
msgstr "Post-traitement"
@@ -3261,6 +3273,10 @@ msgstr "Trier par âge"
msgid "Automatically sort items by (average) age."
msgstr "Trier automatiquement les fichiers par âge (moyen)."
#: sabnzbd/skintext.py
msgid "Direct Unpack"
msgstr "Décompression Directe"
#: sabnzbd/skintext.py
msgid ""
"Posts will be paused untill they are at least this age. Setting job priority "
@@ -4240,7 +4256,7 @@ msgstr "Éditer les détails du NZB"
msgid "Delete"
msgstr "Supprimer"
#: sabnzbd/skintext.py [Job details page, move file to top]
#: sabnzbd/skintext.py [Job details page, move file to top] # sabnzbd/skintext.py
msgid "Top"
msgstr "Tout en haut"
@@ -4252,7 +4268,7 @@ msgstr "Monter"
msgid "Down"
msgstr "Descendre"
#: sabnzbd/skintext.py [Job details page, move file to bottom]
#: sabnzbd/skintext.py [Job details page, move file to bottom] # sabnzbd/skintext.py
msgid "Bottom"
msgstr "Tout en bas"

View File

@@ -7,15 +7,15 @@ msgid ""
msgstr ""
"Project-Id-Version: sabnzbd\n"
"Report-Msgid-Bugs-To: FULL NAME <EMAIL@ADDRESS>\n"
"POT-Creation-Date: 2017-06-26 23:00+0000\n"
"PO-Revision-Date: 2017-06-23 06:44+0000\n"
"POT-Creation-Date: 2017-07-17 18:42+0000\n"
"PO-Revision-Date: 2017-07-18 20:20+0000\n"
"Last-Translator: ION IL <Unknown>\n"
"Language-Team: Hebrew <he@li.org>\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
"X-Launchpad-Export-Date: 2017-06-27 06:01+0000\n"
"X-Generator: Launchpad (build 18416)\n"
"X-Launchpad-Export-Date: 2017-07-19 05:49+0000\n"
"X-Generator: Launchpad (build 18419)\n"
#: SABnzbd.py [Error message]
msgid "Failed to start web-interface"
@@ -421,13 +421,35 @@ msgstr "%s-נוצר באופן גרוע ב yEnc מאמר"
msgid "Unknown Error while decoding %s"
msgstr "%s שגיאה בלתי ידועה בעת פענוח"
#: sabnzbd/decoder.py
msgid "UUencode detected, only yEnc encoding is supported [%s]"
msgstr "[%s] נתמכת yEnc התגלה, רק הצפנת UUencode"
#: sabnzbd/decoder.py
msgid "%s => missing from all servers, discarding"
msgstr "חסר מכל השרתים, משליך <= %s"
#: sabnzbd/decoder.py
msgid "UUencode detected, only yEnc encoding is supported [%s]"
msgstr "[%s] נתמכת yEnc התגלה, רק הצפנת UUencode"
#: sabnzbd/directunpacker.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py
msgid "Unpacking"
msgstr "פורק"
#: sabnzbd/directunpacker.py # sabnzbd/newsunpack.py
msgid "Unpacked %s files/folders in %s"
msgstr "%s-פורקו %s קבצים/תיקיות ב"
#: sabnzbd/directunpacker.py [Warning message]
msgid "Direct Unpack was automatically enabled."
msgstr ".פריקה ישירה אופשרה באופן אוטומטי"
#: sabnzbd/directunpacker.py [Warning message] # sabnzbd/skintext.py
msgid ""
"Jobs will start unpacking during the downloading to reduce post-processing "
"time. Only works for jobs that do not need repair."
msgstr ""
".עבודות יתחילו להיפרק במהלך ההורדה כדי להפחית זמן לאחר-עיבוד. עובד רק עבור "
"עבודות שאינן צריכות תיקון"
#: sabnzbd/dirscanner.py [Error message] # sabnzbd/dirscanner.py [Error message]
msgid "Error removing %s"
@@ -794,8 +816,6 @@ msgstr "RAR [%s] שגיאה \"%s\" בזמן פריקת קבצי"
msgid "Error \"%s\" while running rar_unpack on %s"
msgstr "%s על rar_unpack שגיאת \"%s\" בזמן הרצת"
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
@@ -812,11 +832,6 @@ msgstr "\"%s\" מנסה לחלץ עם הסיסמה"
msgid "Unpacking failed, archive requires a password"
msgstr "פריקה נכשלה, ארכיון דורש סיסמה"
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
msgid "Unpacking"
msgstr "פורק"
#: sabnzbd/newsunpack.py # sabnzbd/skintext.py [PP phase "unpack"]
msgid "Unpack"
msgstr "פרוק"
@@ -878,10 +893,6 @@ msgstr "בלתי שמיש RAR קובץ"
msgid "Corrupt RAR file"
msgstr "פגום RAR קובץ"
#: sabnzbd/newsunpack.py
msgid "Unpacked %s files/folders in %s"
msgstr "%s-פורקו %s קבצים/תיקיות ב"
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
msgid "%s files in %s"
msgstr "%s קבצים ב %s"
@@ -998,7 +1009,7 @@ msgstr "בודק"
#: sabnzbd/newsunpack.py
msgid "Verifying repair"
msgstr ""
msgstr "מוודא תיקון"
#: sabnzbd/newsunpack.py [Error message]
msgid "Python script \"%s\" does not have execute (+x) permission set"
@@ -1477,10 +1488,6 @@ msgstr "הורדה עשויה להיכשל, רק %s מתוך %s דרושים ז
msgid "Download failed - Not on your server(s)"
msgstr "הורדה נכשלה - לא בשרת(ים) שלך"
#: sabnzbd/postproc.py
msgid "Cannot create final folder %s"
msgstr "%s לא יכול ליצור תיקייה סופית"
#: sabnzbd/postproc.py
msgid "No post-processing because of failed verification"
msgstr "אין לאחר-עיבוד בגלל וידוא כושל"
@@ -1545,6 +1552,10 @@ msgstr "(%s) שגיאה בהסרת תיקיית עבודה"
msgid "Download Completed"
msgstr "הורדה הושלמה"
#: sabnzbd/postproc.py [Error message]
msgid "Cannot create final folder %s"
msgstr "%s לא יכול ליצור תיקייה סופית"
#: sabnzbd/postproc.py
msgid "Post-processing"
msgstr "לאחר-עיבוד"
@@ -2503,8 +2514,8 @@ msgid ""
"/>reconstruction of the queue content, preserving already downloaded "
"files.<br />This will modify the queue order."
msgstr ""
"ויעשה בניה SABnzbd כפתור התיקון יפעיל מחדש את<br />.מחדש מלאה של תוכן התור, "
"תוך שימור קבצים שהורדו כבר<br />.זה ישנה את סדר התור"
"ויעשה בניה מחדש מלאה של SABnzbd כפתור התיקון יפעיל מחדש את<br>.תוכן התור, "
"תוך שימור קבצים שהורדו כבר. זה ישנה את סדר התור"
#: sabnzbd/skintext.py # sabnzbd/skintext.py
msgid "Changes have not been saved, and will be lost."
@@ -3148,6 +3159,10 @@ msgstr "מיין לפי גיל"
msgid "Automatically sort items by (average) age."
msgstr ".(מיין פריטים באופן אוטומטי לפי גיל (ממוצע"
#: sabnzbd/skintext.py
msgid "Direct Unpack"
msgstr "פריקה ישירה"
#: sabnzbd/skintext.py
msgid ""
"Posts will be paused untill they are at least this age. Setting job priority "
@@ -4097,7 +4112,7 @@ msgstr "ערוך פרטי NZB"
msgid "Delete"
msgstr "מחק"
#: sabnzbd/skintext.py [Job details page, move file to top]
#: sabnzbd/skintext.py [Job details page, move file to top] # sabnzbd/skintext.py
msgid "Top"
msgstr "ראש"
@@ -4109,7 +4124,7 @@ msgstr "למעלה"
msgid "Down"
msgstr "למטה"
#: sabnzbd/skintext.py [Job details page, move file to bottom]
#: sabnzbd/skintext.py [Job details page, move file to bottom] # sabnzbd/skintext.py
msgid "Bottom"
msgstr "תחתית"

View File

@@ -7,15 +7,15 @@ msgid ""
msgstr ""
"Project-Id-Version: sabnzbd\n"
"Report-Msgid-Bugs-To: FULL NAME <EMAIL@ADDRESS>\n"
"POT-Creation-Date: 2017-06-26 23:00+0000\n"
"POT-Creation-Date: 2017-07-17 18:42+0000\n"
"PO-Revision-Date: 2017-05-23 11:46+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>\n"
"Language-Team: Norwegian Bokmal <nb@li.org>\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
"X-Launchpad-Export-Date: 2017-06-27 06:01+0000\n"
"X-Generator: Launchpad (build 18416)\n"
"X-Launchpad-Export-Date: 2017-07-18 05:26+0000\n"
"X-Generator: Launchpad (build 18419)\n"
#: SABnzbd.py [Error message]
msgid "Failed to start web-interface"
@@ -423,13 +423,33 @@ msgstr "Feilaktigt utformet yEnc artikkel i %s"
msgid "Unknown Error while decoding %s"
msgstr "Ukjent feil oppstod under dekoding av %s"
#: sabnzbd/decoder.py
msgid "UUencode detected, only yEnc encoding is supported [%s]"
msgstr "UUencode oppdaget, bare yEnc koding er støttet [%s]"
#: sabnzbd/decoder.py
msgid "%s => missing from all servers, discarding"
msgstr "%s => mangler på alle servere, fjerner"
#: sabnzbd/decoder.py
msgid "UUencode detected, only yEnc encoding is supported [%s]"
msgstr "UUencode oppdaget, bare yEnc koding er støttet [%s]"
#: sabnzbd/directunpacker.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py
msgid "Unpacking"
msgstr "Utpakker"
#: sabnzbd/directunpacker.py # sabnzbd/newsunpack.py
msgid "Unpacked %s files/folders in %s"
msgstr "Utpakket %s filer/mapper på %s"
#: sabnzbd/directunpacker.py [Warning message]
msgid "Direct Unpack was automatically enabled."
msgstr ""
#: sabnzbd/directunpacker.py [Warning message] # sabnzbd/skintext.py
msgid ""
"Jobs will start unpacking during the downloading to reduce post-processing "
"time. Only works for jobs that do not need repair."
msgstr ""
#: sabnzbd/dirscanner.py [Error message] # sabnzbd/dirscanner.py [Error message]
msgid "Error removing %s"
@@ -800,8 +820,6 @@ msgstr "[%s] Feil \"%s\" under utpakking av RAR fil(er)"
msgid "Error \"%s\" while running rar_unpack on %s"
msgstr "Feil \"%s\" under kjøring av rar_unpack på %s"
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
@@ -818,11 +836,6 @@ msgstr "Prøver unrar med passord \"%s\""
msgid "Unpacking failed, archive requires a password"
msgstr "Utpakking mislyktes, arkivet krever passord"
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
msgid "Unpacking"
msgstr "Utpakker"
#: sabnzbd/newsunpack.py # sabnzbd/skintext.py [PP phase "unpack"]
msgid "Unpack"
msgstr "Utpakking"
@@ -884,10 +897,6 @@ msgstr "Ubrukelig RAR-fil"
msgid "Corrupt RAR file"
msgstr ""
#: sabnzbd/newsunpack.py
msgid "Unpacked %s files/folders in %s"
msgstr "Utpakket %s filer/mapper på %s"
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
msgid "%s files in %s"
msgstr "%s filer på %s"
@@ -1482,10 +1491,6 @@ msgstr "Nedlasting kan feile, kun %s av kravet på %s tilgjengelig"
msgid "Download failed - Not on your server(s)"
msgstr "Nedlastning feilet - Finnes ikke på din(e) server(e)"
#: sabnzbd/postproc.py
msgid "Cannot create final folder %s"
msgstr "Kan ikke opprette mappe %s"
#: sabnzbd/postproc.py
msgid "No post-processing because of failed verification"
msgstr "Ingen etterbehandling, på grunn av misslykket verifisering"
@@ -1550,6 +1555,10 @@ msgstr "Kunne ikke fjerne arbeidsmappe (%s)"
msgid "Download Completed"
msgstr "Nedlasting ferdig"
#: sabnzbd/postproc.py [Error message]
msgid "Cannot create final folder %s"
msgstr "Kan ikke opprette mappe %s"
#: sabnzbd/postproc.py
msgid "Post-processing"
msgstr "Etterbehandling"
@@ -3159,6 +3168,10 @@ msgstr "Sortere etter alder"
msgid "Automatically sort items by (average) age."
msgstr "Sortere automatisk etter(midt) alder."
#: sabnzbd/skintext.py
msgid "Direct Unpack"
msgstr ""
#: sabnzbd/skintext.py
msgid ""
"Posts will be paused untill they are at least this age. Setting job priority "
@@ -4103,7 +4116,7 @@ msgstr "Endre NZB detaljer"
msgid "Delete"
msgstr "Fjern"
#: sabnzbd/skintext.py [Job details page, move file to top]
#: sabnzbd/skintext.py [Job details page, move file to top] # sabnzbd/skintext.py
msgid "Top"
msgstr "Topp"
@@ -4115,7 +4128,7 @@ msgstr "Opp"
msgid "Down"
msgstr "Ned"
#: sabnzbd/skintext.py [Job details page, move file to bottom]
#: sabnzbd/skintext.py [Job details page, move file to bottom] # sabnzbd/skintext.py
msgid "Bottom"
msgstr "Bunn"

View File

@@ -7,15 +7,15 @@ msgid ""
msgstr ""
"Project-Id-Version: sabnzbd\n"
"Report-Msgid-Bugs-To: FULL NAME <EMAIL@ADDRESS>\n"
"POT-Creation-Date: 2017-06-26 23:00+0000\n"
"PO-Revision-Date: 2017-06-22 07:05+0000\n"
"POT-Creation-Date: 2017-07-17 18:42+0000\n"
"PO-Revision-Date: 2017-07-17 20:20+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>\n"
"Language-Team: Dutch <nl@li.org>\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
"X-Launchpad-Export-Date: 2017-06-27 06:01+0000\n"
"X-Generator: Launchpad (build 18416)\n"
"X-Launchpad-Export-Date: 2017-07-18 05:26+0000\n"
"X-Generator: Launchpad (build 18419)\n"
#: SABnzbd.py [Error message]
msgid "Failed to start web-interface"
@@ -441,13 +441,36 @@ msgstr "Slecht opgemaakt yEnc-artikel in %s"
msgid "Unknown Error while decoding %s"
msgstr "Onbekende fout tijdens het decoderen van %s"
#: sabnzbd/decoder.py
msgid "UUencode detected, only yEnc encoding is supported [%s]"
msgstr "UUencode gevonden, SABnzbd verwerkt alleen yEnc-codering [%s]"
#: sabnzbd/decoder.py
msgid "%s => missing from all servers, discarding"
msgstr "%s => ontbreekt op alle servers, overslaan"
#: sabnzbd/decoder.py
msgid "UUencode detected, only yEnc encoding is supported [%s]"
msgstr "UUencode gevonden, SABnzbd verwerkt alleen yEnc-codering [%s]"
#: sabnzbd/directunpacker.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py
msgid "Unpacking"
msgstr "Uitpakken"
#: sabnzbd/directunpacker.py # sabnzbd/newsunpack.py
msgid "Unpacked %s files/folders in %s"
msgstr "%s bestanden/mappen uitgepakt in %s"
#: sabnzbd/directunpacker.py [Warning message]
msgid "Direct Unpack was automatically enabled."
msgstr "Direct Uitpakken is automatisch ingeschakeld."
#: sabnzbd/directunpacker.py [Warning message] # sabnzbd/skintext.py
msgid ""
"Jobs will start unpacking during the downloading to reduce post-processing "
"time. Only works for jobs that do not need repair."
msgstr ""
"Het uitpakken van opdrachten wordt al gestart tijdens het downloaden. Dit "
"verkort de tijd die nodig is voor het nabewerken. Dit werkt alleen als de "
"opdracht niet beschadigd is."
#: sabnzbd/dirscanner.py [Error message] # sabnzbd/dirscanner.py [Error message]
msgid "Error removing %s"
@@ -820,8 +843,6 @@ msgstr "[%s] Fout \"%s\" bij het uitpakken van RAR-bestanden"
msgid "Error \"%s\" while running rar_unpack on %s"
msgstr "Fout \"%s\" bij uitvoeren van 'rar_unpack' op %s"
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
@@ -838,11 +859,6 @@ msgstr "Unrar proberen met wachtwoord \"%s\""
msgid "Unpacking failed, archive requires a password"
msgstr "Uitpakken mislukt, archief vereist wachtwoord"
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
msgid "Unpacking"
msgstr "Uitpakken"
#: sabnzbd/newsunpack.py # sabnzbd/skintext.py [PP phase "unpack"]
msgid "Unpack"
msgstr "Uitpakken"
@@ -904,10 +920,6 @@ msgstr "Onbruikbaar RAR-bestand"
msgid "Corrupt RAR file"
msgstr "Beschadigd RAR-bestand"
#: sabnzbd/newsunpack.py
msgid "Unpacked %s files/folders in %s"
msgstr "%s bestanden/mappen uitgepakt in %s"
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
msgid "%s files in %s"
msgstr "%s bestanden in %s"
@@ -1025,7 +1037,7 @@ msgstr "Controleren"
#: sabnzbd/newsunpack.py
msgid "Verifying repair"
msgstr ""
msgstr "Reparatie controleren"
#: sabnzbd/newsunpack.py [Error message]
msgid "Python script \"%s\" does not have execute (+x) permission set"
@@ -1509,10 +1521,6 @@ msgstr ""
msgid "Download failed - Not on your server(s)"
msgstr "Download mislukt - Niet meer op je server(s)"
#: sabnzbd/postproc.py
msgid "Cannot create final folder %s"
msgstr "Kan bestemmingsmap %s niet maken"
#: sabnzbd/postproc.py
msgid "No post-processing because of failed verification"
msgstr "Geen nabewerking vanwege mislukte verificatie"
@@ -1577,6 +1585,10 @@ msgstr "Fout bij verwijderen van werkmap %s"
msgid "Download Completed"
msgstr "Download voltooid"
#: sabnzbd/postproc.py [Error message]
msgid "Cannot create final folder %s"
msgstr "Kan bestemmingsmap %s niet maken"
#: sabnzbd/postproc.py
msgid "Post-processing"
msgstr "Nabewerking"
@@ -3207,6 +3219,10 @@ msgstr "Sorteer op leeftijd"
msgid "Automatically sort items by (average) age."
msgstr "Automatisch sorteren op basis van gemiddelde leeftijd."
#: sabnzbd/skintext.py
msgid "Direct Unpack"
msgstr "Direct Uitpakken"
#: sabnzbd/skintext.py
msgid ""
"Posts will be paused untill they are at least this age. Setting job priority "
@@ -4180,7 +4196,7 @@ msgstr "Bewerk NZB Details"
msgid "Delete"
msgstr "Verwijder"
#: sabnzbd/skintext.py [Job details page, move file to top]
#: sabnzbd/skintext.py [Job details page, move file to top] # sabnzbd/skintext.py
msgid "Top"
msgstr "Boven"
@@ -4192,7 +4208,7 @@ msgstr "Hoger"
msgid "Down"
msgstr "Lager"
#: sabnzbd/skintext.py [Job details page, move file to bottom]
#: sabnzbd/skintext.py [Job details page, move file to bottom] # sabnzbd/skintext.py
msgid "Bottom"
msgstr "Onder"

View File

@@ -7,15 +7,15 @@ msgid ""
msgstr ""
"Project-Id-Version: sabnzbd\n"
"Report-Msgid-Bugs-To: FULL NAME <EMAIL@ADDRESS>\n"
"POT-Creation-Date: 2017-06-26 23:00+0000\n"
"POT-Creation-Date: 2017-07-17 18:42+0000\n"
"PO-Revision-Date: 2015-12-28 10:22+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>\n"
"Language-Team: Polish <pl@li.org>\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
"X-Launchpad-Export-Date: 2017-06-27 06:01+0000\n"
"X-Generator: Launchpad (build 18416)\n"
"X-Launchpad-Export-Date: 2017-07-18 05:26+0000\n"
"X-Generator: Launchpad (build 18419)\n"
#: SABnzbd.py [Error message]
msgid "Failed to start web-interface"
@@ -423,12 +423,32 @@ msgstr "Źle zbudowany artykuł yEnc w %s"
msgid "Unknown Error while decoding %s"
msgstr "Nieznany błąd podczas dekodowania %s"
#: sabnzbd/decoder.py
msgid "UUencode detected, only yEnc encoding is supported [%s]"
msgstr ""
#: sabnzbd/decoder.py
msgid "%s => missing from all servers, discarding"
msgstr "%s => nie znaleziono na żadnym serwerze, porzucam"
#: sabnzbd/decoder.py
msgid "UUencode detected, only yEnc encoding is supported [%s]"
msgstr ""
#: sabnzbd/directunpacker.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py
msgid "Unpacking"
msgstr "Rozpakowywanie"
#: sabnzbd/directunpacker.py # sabnzbd/newsunpack.py
msgid "Unpacked %s files/folders in %s"
msgstr "Rozpakowano %s plików/katalogów w %s"
#: sabnzbd/directunpacker.py [Warning message]
msgid "Direct Unpack was automatically enabled."
msgstr ""
#: sabnzbd/directunpacker.py [Warning message] # sabnzbd/skintext.py
msgid ""
"Jobs will start unpacking during the downloading to reduce post-processing "
"time. Only works for jobs that do not need repair."
msgstr ""
#: sabnzbd/dirscanner.py [Error message] # sabnzbd/dirscanner.py [Error message]
@@ -803,8 +823,6 @@ msgstr "[%s] Błąd \"%s\" podczas rozpakowywania plików RAR"
msgid "Error \"%s\" while running rar_unpack on %s"
msgstr "Błąd \"%s\" podczas uruchamiania rar_unpack na %s"
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
@@ -821,11 +839,6 @@ msgstr "Próba rozpakowania archiwum RAR z użyciem hasła \"%s\""
msgid "Unpacking failed, archive requires a password"
msgstr "Rozpakowywanie nie powiodło się, archiwum wymaga podania hasła"
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
msgid "Unpacking"
msgstr "Rozpakowywanie"
#: sabnzbd/newsunpack.py # sabnzbd/skintext.py [PP phase "unpack"]
msgid "Unpack"
msgstr "Rozpakuj"
@@ -887,10 +900,6 @@ msgstr "Bezużyteczny plik RAR"
msgid "Corrupt RAR file"
msgstr ""
#: sabnzbd/newsunpack.py
msgid "Unpacked %s files/folders in %s"
msgstr "Rozpakowano %s plików/katalogów w %s"
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
msgid "%s files in %s"
msgstr "%s plików w %s"
@@ -1489,10 +1498,6 @@ msgstr "Pobieranie może się nie udać, dostępne jedynie %s z wymaganych %s"
msgid "Download failed - Not on your server(s)"
msgstr "Pobieranie nieudane - Dane niedostępne na skonfigurowanych serwerach"
#: sabnzbd/postproc.py
msgid "Cannot create final folder %s"
msgstr "Nie można utworzyć ostatecznego katalogu %s"
#: sabnzbd/postproc.py
msgid "No post-processing because of failed verification"
msgstr ""
@@ -1558,6 +1563,10 @@ msgstr "Błąd usuwania katalogu roboczego (%s)"
msgid "Download Completed"
msgstr "Zakończono pobieranie"
#: sabnzbd/postproc.py [Error message]
msgid "Cannot create final folder %s"
msgstr "Nie można utworzyć ostatecznego katalogu %s"
#: sabnzbd/postproc.py
msgid "Post-processing"
msgstr "Przetwarzanie końcowe"
@@ -3174,6 +3183,10 @@ msgstr "Sortuj według wieku"
msgid "Automatically sort items by (average) age."
msgstr "Automatycznie sortuj pozycje według wieku (średniego)"
#: sabnzbd/skintext.py
msgid "Direct Unpack"
msgstr ""
#: sabnzbd/skintext.py
msgid ""
"Posts will be paused untill they are at least this age. Setting job priority "
@@ -4121,7 +4134,7 @@ msgstr "Edytuj szczegóły NZB"
msgid "Delete"
msgstr "Usuń"
#: sabnzbd/skintext.py [Job details page, move file to top]
#: sabnzbd/skintext.py [Job details page, move file to top] # sabnzbd/skintext.py
msgid "Top"
msgstr "Na górę"
@@ -4133,7 +4146,7 @@ msgstr "Wyżej"
msgid "Down"
msgstr "Niżej"
#: sabnzbd/skintext.py [Job details page, move file to bottom]
#: sabnzbd/skintext.py [Job details page, move file to bottom] # sabnzbd/skintext.py
msgid "Bottom"
msgstr "Na dół"

View File

@@ -7,15 +7,15 @@ msgid ""
msgstr ""
"Project-Id-Version: sabnzbd\n"
"Report-Msgid-Bugs-To: FULL NAME <EMAIL@ADDRESS>\n"
"POT-Creation-Date: 2017-06-26 23:00+0000\n"
"POT-Creation-Date: 2017-07-17 18:42+0000\n"
"PO-Revision-Date: 2016-01-01 22:58+0000\n"
"Last-Translator: lrrosa <Unknown>\n"
"Language-Team: Brazilian Portuguese <pt_BR@li.org>\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
"X-Launchpad-Export-Date: 2017-06-27 06:02+0000\n"
"X-Generator: Launchpad (build 18416)\n"
"X-Launchpad-Export-Date: 2017-07-18 05:27+0000\n"
"X-Generator: Launchpad (build 18419)\n"
#: SABnzbd.py [Error message]
msgid "Failed to start web-interface"
@@ -425,12 +425,32 @@ msgstr "Artigo yEnc mal formado em %s"
msgid "Unknown Error while decoding %s"
msgstr "Erro desconhecido ao decodificar %s"
#: sabnzbd/decoder.py
msgid "UUencode detected, only yEnc encoding is supported [%s]"
msgstr ""
#: sabnzbd/decoder.py
msgid "%s => missing from all servers, discarding"
msgstr "%s => faltando em todos os servidores. Descartando"
#: sabnzbd/decoder.py
msgid "UUencode detected, only yEnc encoding is supported [%s]"
msgstr ""
#: sabnzbd/directunpacker.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py
msgid "Unpacking"
msgstr "Descompactando"
#: sabnzbd/directunpacker.py # sabnzbd/newsunpack.py
msgid "Unpacked %s files/folders in %s"
msgstr "Descompactados %s arquivos/pastas em %s"
#: sabnzbd/directunpacker.py [Warning message]
msgid "Direct Unpack was automatically enabled."
msgstr ""
#: sabnzbd/directunpacker.py [Warning message] # sabnzbd/skintext.py
msgid ""
"Jobs will start unpacking during the downloading to reduce post-processing "
"time. Only works for jobs that do not need repair."
msgstr ""
#: sabnzbd/dirscanner.py [Error message] # sabnzbd/dirscanner.py [Error message]
@@ -803,8 +823,6 @@ msgstr "[%s] Erro \"%s\" ao descompactar os arquivos RAR"
msgid "Error \"%s\" while running rar_unpack on %s"
msgstr "Erro \"%s\" ao executar rar_unpack em %s"
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
@@ -821,11 +839,6 @@ msgstr "Tentando descompactar com a senha \"%s\""
msgid "Unpacking failed, archive requires a password"
msgstr "A descompactação falhou. O arquivo exige uma senha"
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
msgid "Unpacking"
msgstr "Descompactando"
#: sabnzbd/newsunpack.py # sabnzbd/skintext.py [PP phase "unpack"]
msgid "Unpack"
msgstr "Descompactar"
@@ -887,10 +900,6 @@ msgstr "Arquivo RAR inutilizável"
msgid "Corrupt RAR file"
msgstr ""
#: sabnzbd/newsunpack.py
msgid "Unpacked %s files/folders in %s"
msgstr "Descompactados %s arquivos/pastas em %s"
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
msgid "%s files in %s"
msgstr "%s arquivos em %s"
@@ -1488,10 +1497,6 @@ msgstr ""
msgid "Download failed - Not on your server(s)"
msgstr "O download falhou - Não está em seu(s) servidor(s)"
#: sabnzbd/postproc.py
msgid "Cannot create final folder %s"
msgstr "Não é possível criar a pasta final %s"
#: sabnzbd/postproc.py
msgid "No post-processing because of failed verification"
msgstr "Sem pós-processamento por causa de falha na verificação"
@@ -1556,6 +1561,10 @@ msgstr "Erro ao remover a pasta de trabalho (%s)"
msgid "Download Completed"
msgstr "Download concluído"
#: sabnzbd/postproc.py [Error message]
msgid "Cannot create final folder %s"
msgstr "Não é possível criar a pasta final %s"
#: sabnzbd/postproc.py
msgid "Post-processing"
msgstr "Pós-processamento"
@@ -3173,6 +3182,10 @@ msgstr "Ordernar por Idade"
msgid "Automatically sort items by (average) age."
msgstr "Classificar automaticamente os itens por (média de) idade."
#: sabnzbd/skintext.py
msgid "Direct Unpack"
msgstr ""
#: sabnzbd/skintext.py
msgid ""
"Posts will be paused untill they are at least this age. Setting job priority "
@@ -4119,7 +4132,7 @@ msgstr "Editar Detalhes do NZB"
msgid "Delete"
msgstr "Eliminar"
#: sabnzbd/skintext.py [Job details page, move file to top]
#: sabnzbd/skintext.py [Job details page, move file to top] # sabnzbd/skintext.py
msgid "Top"
msgstr "Topo"
@@ -4131,7 +4144,7 @@ msgstr "Para cima"
msgid "Down"
msgstr "Para baixo"
#: sabnzbd/skintext.py [Job details page, move file to bottom]
#: sabnzbd/skintext.py [Job details page, move file to bottom] # sabnzbd/skintext.py
msgid "Bottom"
msgstr "Base"


@@ -7,15 +7,15 @@ msgid ""
msgstr ""
"Project-Id-Version: sabnzbd\n"
"Report-Msgid-Bugs-To: FULL NAME <EMAIL@ADDRESS>\n"
"POT-Creation-Date: 2017-06-26 23:00+0000\n"
"POT-Creation-Date: 2017-07-17 18:42+0000\n"
"PO-Revision-Date: 2016-07-29 16:20+0000\n"
"Last-Translator: nicusor <Unknown>\n"
"Language-Team: Romanian <ro@li.org>\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
"X-Launchpad-Export-Date: 2017-06-27 06:01+0000\n"
"X-Generator: Launchpad (build 18416)\n"
"X-Launchpad-Export-Date: 2017-07-18 05:27+0000\n"
"X-Generator: Launchpad (build 18419)\n"
#: SABnzbd.py [Error message]
msgid "Failed to start web-interface"
@@ -427,12 +427,32 @@ msgstr "Articoul yEnc invalid în %s"
msgid "Unknown Error while decoding %s"
msgstr "Eroare Necunoscută în timpul decodării %s"
#: sabnzbd/decoder.py
msgid "UUencode detected, only yEnc encoding is supported [%s]"
msgstr ""
#: sabnzbd/decoder.py
msgid "%s => missing from all servers, discarding"
msgstr "%s => lipsă de pe toate serverele, ignorare"
#: sabnzbd/decoder.py
msgid "UUencode detected, only yEnc encoding is supported [%s]"
#: sabnzbd/directunpacker.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py
msgid "Unpacking"
msgstr "Dezarhivare"
#: sabnzbd/directunpacker.py # sabnzbd/newsunpack.py
msgid "Unpacked %s files/folders in %s"
msgstr "Dezarhivat %s fişierele/dosarele în %s"
#: sabnzbd/directunpacker.py [Warning message]
msgid "Direct Unpack was automatically enabled."
msgstr ""
#: sabnzbd/directunpacker.py [Warning message] # sabnzbd/skintext.py
msgid ""
"Jobs will start unpacking during the downloading to reduce post-processing "
"time. Only works for jobs that do not need repair."
msgstr ""
#: sabnzbd/dirscanner.py [Error message] # sabnzbd/dirscanner.py [Error message]
@@ -806,8 +826,6 @@ msgstr "[%s] Eroare \"%s\" în timpul dezarhivării fişierelor RAR"
msgid "Error \"%s\" while running rar_unpack on %s"
msgstr "Eroare \"%s\" în timpul rar_unpack a %s"
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
@@ -824,11 +842,6 @@ msgstr "Încerc unrar cu parola \"%s\""
msgid "Unpacking failed, archive requires a password"
msgstr "Dezarhivare nereuşită, arhiva necesită o parolă"
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
msgid "Unpacking"
msgstr "Dezarhivare"
#: sabnzbd/newsunpack.py # sabnzbd/skintext.py [PP phase "unpack"]
msgid "Unpack"
msgstr "Dezarhivează"
@@ -890,10 +903,6 @@ msgstr "Fișier RAR ce poate fi folosit"
msgid "Corrupt RAR file"
msgstr ""
#: sabnzbd/newsunpack.py
msgid "Unpacked %s files/folders in %s"
msgstr "Dezarhivat %s fişierele/dosarele în %s"
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
msgid "%s files in %s"
msgstr "%s fişiere în %s"
@@ -1494,10 +1503,6 @@ msgstr "Descărcarea ar putea eşua, doar %s din %s disponibil"
msgid "Download failed - Not on your server(s)"
msgstr "Descărcare euată, - Nu este pe serverul(ele) dumneavoastră"
#: sabnzbd/postproc.py
msgid "Cannot create final folder %s"
msgstr "Nu pot crea dosar final %s"
#: sabnzbd/postproc.py
msgid "No post-processing because of failed verification"
msgstr "Nici o post-procesare din cauza verificării nereuşite"
@@ -1562,6 +1567,10 @@ msgstr "Eroare ştergere dosar curent (%s)"
msgid "Download Completed"
msgstr "Descărcare terminată"
#: sabnzbd/postproc.py [Error message]
msgid "Cannot create final folder %s"
msgstr "Nu pot crea dosar final %s"
#: sabnzbd/postproc.py
msgid "Post-processing"
msgstr "Post-procesare"
@@ -3172,6 +3181,10 @@ msgstr "Sortează după Vârstă"
msgid "Automatically sort items by (average) age."
msgstr "Sortează automat obiectele dupa vârstă (medie)."
#: sabnzbd/skintext.py
msgid "Direct Unpack"
msgstr ""
#: sabnzbd/skintext.py
msgid ""
"Posts will be paused untill they are at least this age. Setting job priority "
@@ -4123,7 +4136,7 @@ msgstr "Editează Detalii NZB"
msgid "Delete"
msgstr "Şterge"
#: sabnzbd/skintext.py [Job details page, move file to top]
#: sabnzbd/skintext.py [Job details page, move file to top] # sabnzbd/skintext.py
msgid "Top"
msgstr "Vârf"
@@ -4135,7 +4148,7 @@ msgstr "Sus"
msgid "Down"
msgstr "Jos"
#: sabnzbd/skintext.py [Job details page, move file to bottom]
#: sabnzbd/skintext.py [Job details page, move file to bottom] # sabnzbd/skintext.py
msgid "Bottom"
msgstr "Coadă"


@@ -2,15 +2,15 @@ msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-0.7.x\n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2017-06-26 23:00+0000\n"
"POT-Creation-Date: 2017-07-17 18:42+0000\n"
"PO-Revision-Date: 2013-05-05 14:50+0000\n"
"Last-Translator: Pavel Maryanov <Unknown>\n"
"Language-Team: Russian <gmu@mx.ru>\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
"X-Launchpad-Export-Date: 2017-06-27 06:01+0000\n"
"X-Generator: Launchpad (build 18416)\n"
"X-Launchpad-Export-Date: 2017-07-18 05:27+0000\n"
"X-Generator: Launchpad (build 18419)\n"
"Generated-By: pygettext.py 1.5\n"
#: SABnzbd.py [Error message]
@@ -416,12 +416,32 @@ msgstr "Неверно сформированная статья yEnc в %s"
msgid "Unknown Error while decoding %s"
msgstr "Неизвестная ошибка декодирования %s"
#: sabnzbd/decoder.py
msgid "UUencode detected, only yEnc encoding is supported [%s]"
msgstr ""
#: sabnzbd/decoder.py
msgid "%s => missing from all servers, discarding"
msgstr "%s => отсутствует на всех серверах, отброшен"
#: sabnzbd/decoder.py
msgid "UUencode detected, only yEnc encoding is supported [%s]"
#: sabnzbd/directunpacker.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py
msgid "Unpacking"
msgstr "Распаковка"
#: sabnzbd/directunpacker.py # sabnzbd/newsunpack.py
msgid "Unpacked %s files/folders in %s"
msgstr "Распаковка %s файлов или папок в %s"
#: sabnzbd/directunpacker.py [Warning message]
msgid "Direct Unpack was automatically enabled."
msgstr ""
#: sabnzbd/directunpacker.py [Warning message] # sabnzbd/skintext.py
msgid ""
"Jobs will start unpacking during the downloading to reduce post-processing "
"time. Only works for jobs that do not need repair."
msgstr ""
#: sabnzbd/dirscanner.py [Error message] # sabnzbd/dirscanner.py [Error message]
@@ -793,8 +813,6 @@ msgstr "[%s] Ошибка распаковки RAR-файлов: %s"
msgid "Error \"%s\" while running rar_unpack on %s"
msgstr "Ошибка «%s» выполнения rar_unpack для %s"
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
@@ -811,11 +829,6 @@ msgstr "Попытка распаковки RAR-архива с паролем
msgid "Unpacking failed, archive requires a password"
msgstr "Ошибка распаковки: архив защищён паролем"
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
msgid "Unpacking"
msgstr "Распаковка"
#: sabnzbd/newsunpack.py # sabnzbd/skintext.py [PP phase "unpack"]
msgid "Unpack"
msgstr "Распаковать"
@@ -877,10 +890,6 @@ msgstr ""
msgid "Corrupt RAR file"
msgstr ""
#: sabnzbd/newsunpack.py
msgid "Unpacked %s files/folders in %s"
msgstr "Распаковка %s файлов или папок в %s"
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
msgid "%s files in %s"
msgstr "%s файлов в %s"
@@ -1480,10 +1489,6 @@ msgstr ""
msgid "Download failed - Not on your server(s)"
msgstr ""
#: sabnzbd/postproc.py
msgid "Cannot create final folder %s"
msgstr "Не удаётся создать конечную папку %s"
#: sabnzbd/postproc.py
msgid "No post-processing because of failed verification"
msgstr "Отмена пост-обработка из-за ошибки проверки"
@@ -1548,6 +1553,10 @@ msgstr "Не удалось удалить рабочий каталог (%s)"
msgid "Download Completed"
msgstr "Загрузка завершена"
#: sabnzbd/postproc.py [Error message]
msgid "Cannot create final folder %s"
msgstr "Не удаётся создать конечную папку %s"
#: sabnzbd/postproc.py
msgid "Post-processing"
msgstr "Пост-обработка"
@@ -3151,6 +3160,10 @@ msgstr "Сортировать по возрасту"
msgid "Automatically sort items by (average) age."
msgstr "Автоматически сортировать элементы по (среднему) возрасту"
#: sabnzbd/skintext.py
msgid "Direct Unpack"
msgstr ""
#: sabnzbd/skintext.py
msgid ""
"Posts will be paused untill they are at least this age. Setting job priority "
@@ -4101,7 +4114,7 @@ msgstr "Изменить данные NZB"
msgid "Delete"
msgstr "Удалить"
#: sabnzbd/skintext.py [Job details page, move file to top]
#: sabnzbd/skintext.py [Job details page, move file to top] # sabnzbd/skintext.py
msgid "Top"
msgstr "В начало"
@@ -4113,7 +4126,7 @@ msgstr "Вверх"
msgid "Down"
msgstr "Вниз"
#: sabnzbd/skintext.py [Job details page, move file to bottom]
#: sabnzbd/skintext.py [Job details page, move file to bottom] # sabnzbd/skintext.py
msgid "Bottom"
msgstr "В конец"


@@ -7,15 +7,15 @@ msgid ""
msgstr ""
"Project-Id-Version: sabnzbd\n"
"Report-Msgid-Bugs-To: ОZZII <ozzii.translate@gmail.com>\n"
"POT-Creation-Date: 2017-06-26 23:00+0000\n"
"POT-Creation-Date: 2017-07-17 18:42+0000\n"
"PO-Revision-Date: 2015-12-28 10:25+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>\n"
"Language-Team: Serbian <sr@li.org>\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
"X-Launchpad-Export-Date: 2017-06-27 06:01+0000\n"
"X-Generator: Launchpad (build 18416)\n"
"X-Launchpad-Export-Date: 2017-07-18 05:27+0000\n"
"X-Generator: Launchpad (build 18419)\n"
#: SABnzbd.py [Error message]
msgid "Failed to start web-interface"
@@ -420,12 +420,32 @@ msgstr "Лоше формиран yEnc артикал у %s"
msgid "Unknown Error while decoding %s"
msgstr "Nepoznata greška pri dešifrovanju %s"
#: sabnzbd/decoder.py
msgid "UUencode detected, only yEnc encoding is supported [%s]"
msgstr ""
#: sabnzbd/decoder.py
msgid "%s => missing from all servers, discarding"
msgstr "%s => фали на свим серверима, одбацивање"
#: sabnzbd/decoder.py
msgid "UUencode detected, only yEnc encoding is supported [%s]"
#: sabnzbd/directunpacker.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py
msgid "Unpacking"
msgstr "Распакивање"
#: sabnzbd/directunpacker.py # sabnzbd/newsunpack.py
msgid "Unpacked %s files/folders in %s"
msgstr "Издвојено %s датотека/фасцикла у %s"
#: sabnzbd/directunpacker.py [Warning message]
msgid "Direct Unpack was automatically enabled."
msgstr ""
#: sabnzbd/directunpacker.py [Warning message] # sabnzbd/skintext.py
msgid ""
"Jobs will start unpacking during the downloading to reduce post-processing "
"time. Only works for jobs that do not need repair."
msgstr ""
#: sabnzbd/dirscanner.py [Error message] # sabnzbd/dirscanner.py [Error message]
@@ -795,8 +815,6 @@ msgstr "[%s] Greška \"%s\" pri raspakivanju RAR datoteka"
msgid "Error \"%s\" while running rar_unpack on %s"
msgstr "Грешка \"%s\" док сам радио 'rar_unpack' на %s"
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
@@ -813,11 +831,6 @@ msgstr "Proba raspakivanja sa lozinkom \"%s\""
msgid "Unpacking failed, archive requires a password"
msgstr "Neuspešno raspakivanje, arhiva zahteva lozinku"
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
msgid "Unpacking"
msgstr "Распакивање"
#: sabnzbd/newsunpack.py # sabnzbd/skintext.py [PP phase "unpack"]
msgid "Unpack"
msgstr "Распакуј"
@@ -879,10 +892,6 @@ msgstr "Neupotrebljiva RAR datoteka"
msgid "Corrupt RAR file"
msgstr ""
#: sabnzbd/newsunpack.py
msgid "Unpacked %s files/folders in %s"
msgstr "Издвојено %s датотека/фасцикла у %s"
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
msgid "%s files in %s"
msgstr "%s датотека у %s"
@@ -1474,10 +1483,6 @@ msgstr "Преузимање је можда погрешно. има %s од п
msgid "Download failed - Not on your server(s)"
msgstr "Неуспешно преузимање - није на вашем серверу"
#: sabnzbd/postproc.py
msgid "Cannot create final folder %s"
msgstr "Немогуће креирање фасцикле %s"
#: sabnzbd/postproc.py
msgid "No post-processing because of failed verification"
msgstr "Нема пост-процесирање пошто провера није успела"
@@ -1542,6 +1547,10 @@ msgstr "Грешка у брисању радне фасцикле (%s)"
msgid "Download Completed"
msgstr "Преузимање завршено"
#: sabnzbd/postproc.py [Error message]
msgid "Cannot create final folder %s"
msgstr "Немогуће креирање фасцикле %s"
#: sabnzbd/postproc.py
msgid "Post-processing"
msgstr "Пост-процесирање"
@@ -3146,6 +3155,10 @@ msgstr "Сортирај по старост"
msgid "Automatically sort items by (average) age."
msgstr "Аутоматско сортирај ставке по старост (просек)."
#: sabnzbd/skintext.py
msgid "Direct Unpack"
msgstr ""
#: sabnzbd/skintext.py
msgid ""
"Posts will be paused untill they are at least this age. Setting job priority "
@@ -4087,7 +4100,7 @@ msgstr "Уреди детаље NZB-а"
msgid "Delete"
msgstr "Обриши"
#: sabnzbd/skintext.py [Job details page, move file to top]
#: sabnzbd/skintext.py [Job details page, move file to top] # sabnzbd/skintext.py
msgid "Top"
msgstr "Врх"
@@ -4099,7 +4112,7 @@ msgstr "Горе"
msgid "Down"
msgstr "Доле"
#: sabnzbd/skintext.py [Job details page, move file to bottom]
#: sabnzbd/skintext.py [Job details page, move file to bottom] # sabnzbd/skintext.py
msgid "Bottom"
msgstr "Дно"


@@ -7,15 +7,15 @@ msgid ""
msgstr ""
"Project-Id-Version: sabnzbd\n"
"Report-Msgid-Bugs-To: FULL NAME <EMAIL@ADDRESS>\n"
"POT-Creation-Date: 2017-06-26 23:00+0000\n"
"POT-Creation-Date: 2017-07-17 18:42+0000\n"
"PO-Revision-Date: 2016-02-20 20:34+0000\n"
"Last-Translator: shypike <Unknown>\n"
"Language-Team: Swedish <sv@li.org>\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
"X-Launchpad-Export-Date: 2017-06-27 06:01+0000\n"
"X-Generator: Launchpad (build 18416)\n"
"X-Launchpad-Export-Date: 2017-07-18 05:27+0000\n"
"X-Generator: Launchpad (build 18419)\n"
#: SABnzbd.py [Error message]
msgid "Failed to start web-interface"
@@ -421,12 +421,32 @@ msgstr "Felaktigt utformad yEnc artikel i %s"
msgid "Unknown Error while decoding %s"
msgstr "Okänt fel under avkodning av %s"
#: sabnzbd/decoder.py
msgid "UUencode detected, only yEnc encoding is supported [%s]"
msgstr ""
#: sabnzbd/decoder.py
msgid "%s => missing from all servers, discarding"
msgstr "%s => saknas från alla servrar, kastar"
#: sabnzbd/decoder.py
msgid "UUencode detected, only yEnc encoding is supported [%s]"
#: sabnzbd/directunpacker.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py
msgid "Unpacking"
msgstr "Packar upp"
#: sabnzbd/directunpacker.py # sabnzbd/newsunpack.py
msgid "Unpacked %s files/folders in %s"
msgstr "Uppackad %s filer/mappar i %s"
#: sabnzbd/directunpacker.py [Warning message]
msgid "Direct Unpack was automatically enabled."
msgstr ""
#: sabnzbd/directunpacker.py [Warning message] # sabnzbd/skintext.py
msgid ""
"Jobs will start unpacking during the downloading to reduce post-processing "
"time. Only works for jobs that do not need repair."
msgstr ""
#: sabnzbd/dirscanner.py [Error message] # sabnzbd/dirscanner.py [Error message]
@@ -799,8 +819,6 @@ msgstr "[%s] Fel \"%s\" under uppackning av RAR fil(er)"
msgid "Error \"%s\" while running rar_unpack on %s"
msgstr "Fel \"%s\" när du kör rar_unpack på %s"
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
@@ -817,11 +835,6 @@ msgstr "Försöker att packa upp med lösenord %s"
msgid "Unpacking failed, archive requires a password"
msgstr "Uppackning misslyckades, arkivet kräver lösenord"
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
msgid "Unpacking"
msgstr "Packar upp"
#: sabnzbd/newsunpack.py # sabnzbd/skintext.py [PP phase "unpack"]
msgid "Unpack"
msgstr "Packa upp"
@@ -883,10 +896,6 @@ msgstr "Oanvändbar RAR-fil"
msgid "Corrupt RAR file"
msgstr ""
#: sabnzbd/newsunpack.py
msgid "Unpacked %s files/folders in %s"
msgstr "Uppackad %s filer/mappar i %s"
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
msgid "%s files in %s"
msgstr "%s filer i %s"
@@ -1485,10 +1494,6 @@ msgstr ""
msgid "Download failed - Not on your server(s)"
msgstr "Nerladdning misslyckades - Inte på din server eller servrar"
#: sabnzbd/postproc.py
msgid "Cannot create final folder %s"
msgstr "Kan inte skapa slutgiltig mapp %s"
#: sabnzbd/postproc.py
msgid "No post-processing because of failed verification"
msgstr "Ingen efterbehandling på grund av misslyckad verifiering"
@@ -1553,6 +1558,10 @@ msgstr "Det gick inte att ta bort arbetsmapp (%s)"
msgid "Download Completed"
msgstr "Hämtningen slutfördes"
#: sabnzbd/postproc.py [Error message]
msgid "Cannot create final folder %s"
msgstr "Kan inte skapa slutgiltig mapp %s"
#: sabnzbd/postproc.py
msgid "Post-processing"
msgstr "Efterbehandling"
@@ -3159,6 +3168,10 @@ msgstr "Sortera efter ålder"
msgid "Automatically sort items by (average) age."
msgstr "Sortera automatiskt efter (medel) ålder."
#: sabnzbd/skintext.py
msgid "Direct Unpack"
msgstr ""
#: sabnzbd/skintext.py
msgid ""
"Posts will be paused untill they are at least this age. Setting job priority "
@@ -4102,7 +4115,7 @@ msgstr "Ändra NZB detaljer"
msgid "Delete"
msgstr "Ta bort"
#: sabnzbd/skintext.py [Job details page, move file to top]
#: sabnzbd/skintext.py [Job details page, move file to top] # sabnzbd/skintext.py
msgid "Top"
msgstr "Topp"
@@ -4114,7 +4127,7 @@ msgstr "Upp"
msgid "Down"
msgstr "Ner"
#: sabnzbd/skintext.py [Job details page, move file to bottom]
#: sabnzbd/skintext.py [Job details page, move file to bottom] # sabnzbd/skintext.py
msgid "Bottom"
msgstr "Botten"


@@ -7,15 +7,15 @@ msgid ""
msgstr ""
"Project-Id-Version: sabnzbd\n"
"Report-Msgid-Bugs-To: FULL NAME <EMAIL@ADDRESS>\n"
"POT-Creation-Date: 2017-06-26 23:00+0000\n"
"POT-Creation-Date: 2017-07-17 18:42+0000\n"
"PO-Revision-Date: 2017-06-22 07:06+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>\n"
"Language-Team: Chinese (Simplified) <zh_CN@li.org>\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
"X-Launchpad-Export-Date: 2017-06-27 06:02+0000\n"
"X-Generator: Launchpad (build 18416)\n"
"X-Launchpad-Export-Date: 2017-07-18 05:27+0000\n"
"X-Generator: Launchpad (build 18419)\n"
#: SABnzbd.py [Error message]
msgid "Failed to start web-interface"
@@ -415,13 +415,33 @@ msgstr "yEnc 文章格式错误:%s"
msgid "Unknown Error while decoding %s"
msgstr "解码 %s 时发生未知错误"
#: sabnzbd/decoder.py
msgid "UUencode detected, only yEnc encoding is supported [%s]"
msgstr "检测到 UUencode但是仅有 yEnc 编码受支持 [%s]"
#: sabnzbd/decoder.py
msgid "%s => missing from all servers, discarding"
msgstr "%s => 所有服务器均缺失,正在舍弃"
#: sabnzbd/decoder.py
msgid "UUencode detected, only yEnc encoding is supported [%s]"
msgstr "检测到 UUencode但是仅有 yEnc 编码受支持 [%s]"
#: sabnzbd/directunpacker.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py
msgid "Unpacking"
msgstr "正在解压"
#: sabnzbd/directunpacker.py # sabnzbd/newsunpack.py
msgid "Unpacked %s files/folders in %s"
msgstr "已解压 %s 个文件/文件夹,耗时 %s"
#: sabnzbd/directunpacker.py [Warning message]
msgid "Direct Unpack was automatically enabled."
msgstr ""
#: sabnzbd/directunpacker.py [Warning message] # sabnzbd/skintext.py
msgid ""
"Jobs will start unpacking during the downloading to reduce post-processing "
"time. Only works for jobs that do not need repair."
msgstr ""
#: sabnzbd/dirscanner.py [Error message] # sabnzbd/dirscanner.py [Error message]
msgid "Error removing %s"
@@ -782,8 +802,6 @@ msgstr "[%s] \"%s\" 解压 RAR 文件时出错"
msgid "Error \"%s\" while running rar_unpack on %s"
msgstr "出现错误 \"%s\",正对 %s 执行 rar_unpack 操作"
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
#: sabnzbd/newsunpack.py [Warning message] # sabnzbd/newsunpack.py [Warning message]
@@ -800,11 +818,6 @@ msgstr "正在尝试 unrar使用密码 \"%s\""
msgid "Unpacking failed, archive requires a password"
msgstr "解压失败,压缩文件需要密码"
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
msgid "Unpacking"
msgstr "正在解压"
#: sabnzbd/newsunpack.py # sabnzbd/skintext.py [PP phase "unpack"]
msgid "Unpack"
msgstr "解压"
@@ -866,10 +879,6 @@ msgstr "无法使用的 RAR 文件"
msgid "Corrupt RAR file"
msgstr "损坏的 RAR 文件"
#: sabnzbd/newsunpack.py
msgid "Unpacked %s files/folders in %s"
msgstr "已解压 %s 个文件/文件夹,耗时 %s"
#: sabnzbd/newsunpack.py # sabnzbd/newsunpack.py
msgid "%s files in %s"
msgstr "%s 个文件,耗时 %s"
@@ -1458,10 +1467,6 @@ msgstr "下载可能会失败,只有 %s 块 (需要 %s) 可用"
msgid "Download failed - Not on your server(s)"
msgstr "下载失败 - 不在该服务器上"
#: sabnzbd/postproc.py
msgid "Cannot create final folder %s"
msgstr "无法创建最终文件夹 %s"
#: sabnzbd/postproc.py
msgid "No post-processing because of failed verification"
msgstr "由于验证失败,未进行后期处理"
@@ -1526,6 +1531,10 @@ msgstr "移除工作目录出错 (%s)"
msgid "Download Completed"
msgstr "下载完成"
#: sabnzbd/postproc.py [Error message]
msgid "Cannot create final folder %s"
msgstr "无法创建最终文件夹 %s"
#: sabnzbd/postproc.py
msgid "Post-processing"
msgstr "后期处理"
@@ -3094,6 +3103,10 @@ msgstr "按发布时间排列"
msgid "Automatically sort items by (average) age."
msgstr "自动按 (平均) 发布时间排列项目。"
#: sabnzbd/skintext.py
msgid "Direct Unpack"
msgstr ""
#: sabnzbd/skintext.py
msgid ""
"Posts will be paused untill they are at least this age. Setting job priority "
@@ -4029,7 +4042,7 @@ msgstr "编辑 NZB 详情"
msgid "Delete"
msgstr "删除"
#: sabnzbd/skintext.py [Job details page, move file to top]
#: sabnzbd/skintext.py [Job details page, move file to top] # sabnzbd/skintext.py
msgid "Top"
msgstr "置顶"
@@ -4041,7 +4054,7 @@ msgstr "上移"
msgid "Down"
msgstr "下移"
#: sabnzbd/skintext.py [Job details page, move file to bottom]
#: sabnzbd/skintext.py [Job details page, move file to bottom] # sabnzbd/skintext.py
msgid "Bottom"
msgstr "置底"


@@ -106,6 +106,7 @@ import sabnzbd.cfg as cfg
import sabnzbd.database
import sabnzbd.lang as lang
import sabnzbd.api
import sabnzbd.directunpacker as directunpacker
from sabnzbd.decorators import synchronized, notify_downloader
from sabnzbd.constants import NORMAL_PRIORITY, VALID_ARCHIVES, GIGI, \
REPAIR_REQUEST, QUEUE_FILE_NAME, QUEUE_VERSION, QUEUE_FILE_TMPL
@@ -307,6 +308,7 @@ def initialize(pause_downloader=False, clean_up=False, evalSched=False, repair=0
if cfg.sched_converted() != 2:
cfg.schedules.set(['%s %s' % (1, schedule) for schedule in cfg.schedules()])
cfg.sched_converted.set(2)
config.save_config()
if check_repair_request():
repair = 2
@@ -383,6 +385,8 @@ def halt():
sabnzbd.zconfig.remove_server()
sabnzbd.directunpacker.abort_all()
rss.stop()
logging.debug('Stopping URLGrabber')


@@ -1347,6 +1347,7 @@ def build_queue(start=0, limit=0, trans=False, output=None, search=None):
slot['percentage'] = "%s" % (int(((mb - mbleft) / mb) * 100)) if mb != mbleft else '0'
slot['missing'] = pnfo.missing
slot['mbmissing'] = "%.2f" % (pnfo.bytes_missing / MEBI)
slot['direct_unpack'] = pnfo.direct_unpack
if not output:
slot['mb_fmt'] = locale.format('%d', int(mb), True)
slot['mbdone_fmt'] = locale.format('%d', int(mb - mbleft), True)
@@ -1517,7 +1518,6 @@ def options_list(output):
return report(output, keyword='options', data={
'yenc': sabnzbd.decoder.HAVE_YENC,
'par2': sabnzbd.newsunpack.PAR2_COMMAND,
'par2c': sabnzbd.newsunpack.PAR2C_COMMAND,
'multipar': sabnzbd.newsunpack.MULTIPAR_COMMAND,
'rar': sabnzbd.newsunpack.RAR_COMMAND,
'zip': sabnzbd.newsunpack.ZIP_COMMAND,


@@ -30,8 +30,8 @@ import hashlib
import sabnzbd
from sabnzbd.misc import get_filepath, sanitize_filename, get_unique_filename, renamer, \
set_permissions, flag_file, long_path, clip_path, has_win_device, get_all_passwords
from sabnzbd.constants import QCHECK_FILE, Status
set_permissions, long_path, clip_path, has_win_device, get_all_passwords
from sabnzbd.constants import Status
import sabnzbd.cfg as cfg
from sabnzbd.articlecache import ArticleCache
from sabnzbd.postproc import PostProcessor
@@ -70,18 +70,16 @@ class Assembler(Thread):
if nzf:
sabnzbd.CheckFreeSpace()
# We allow win_devices because otherwise par2cmdline fails to repair
filename = sanitize_filename(nzf.filename, allow_win_devices=True)
filename = sanitize_filename(nzf.filename)
nzf.filename = filename
dupe = nzo.check_for_dupe(nzf)
filepath = get_filepath(long_path(cfg.download_dir.get_path()), nzo, filename)
if filepath:
logging.info('Decoding %s %s', filepath, nzf.type)
try:
filepath = _assemble(nzf, filepath, dupe)
filepath = self.assemble(nzf, filepath, dupe)
except IOError, (errno, strerror):
# If job was deleted, ignore error
if not nzo.is_gone():
@@ -100,7 +98,7 @@ class Assembler(Thread):
nzf.remove_admin()
setname = nzf.setname
if nzf.is_par2 and (nzo.md5packs.get(setname) is None):
pack = GetMD5Hashes(filepath)[0]
pack = self.parse_par2_file(filepath, nzo.md5of16k)
if pack:
nzo.md5packs[setname] = pack
logging.debug('Got md5pack for set %s', setname)
@@ -113,15 +111,15 @@ class Assembler(Thread):
rar_encrypted, unwanted_file = check_encrypted_and_unwanted_files(nzo, filepath)
if rar_encrypted:
if cfg.pause_on_pwrar() == 1:
logging.warning(T('WARNING: Paused job "%s" because of encrypted RAR file (if supplied, all passwords were tried)'), nzo.final_name)
logging.warning(remove_warning_label(T('WARNING: Paused job "%s" because of encrypted RAR file (if supplied, all passwords were tried)')), nzo.final_name)
nzo.pause()
else:
logging.warning(T('WARNING: Aborted job "%s" because of encrypted RAR file (if supplied, all passwords were tried)'), nzo.final_name)
logging.warning(remove_warning_label(T('WARNING: Aborted job "%s" because of encrypted RAR file (if supplied, all passwords were tried)')), nzo.final_name)
nzo.fail_msg = T('Aborted, encryption detected')
sabnzbd.nzbqueue.NzbQueue.do.end_job(nzo)
if unwanted_file:
logging.warning(T('WARNING: In "%s" unwanted extension in RAR file. Unwanted file is %s '), nzo.final_name, unwanted_file)
logging.warning(remove_warning_label(T('WARNING: In "%s" unwanted extension in RAR file. Unwanted file is %s ')), nzo.final_name, unwanted_file)
logging.debug(T('Unwanted extension is in rar file %s'), filepath)
if cfg.action_on_unwanted_extensions() == 1 and nzo.unwanted_ext == 0:
logging.debug('Unwanted extension ... pausing')
@@ -134,54 +132,104 @@ class Assembler(Thread):
filter, reason = nzo_filtered_by_rating(nzo)
if filter == 1:
logging.warning(T('WARNING: Paused job "%s" because of rating (%s)'), nzo.final_name, reason)
logging.warning(remove_warning_label(T('WARNING: Paused job "%s" because of rating (%s)')), nzo.final_name, reason)
nzo.pause()
elif filter == 2:
logging.warning(T('WARNING: Aborted job "%s" because of rating (%s)'), nzo.final_name, reason)
logging.warning(remove_warning_label(T('WARNING: Aborted job "%s" because of rating (%s)')), nzo.final_name, reason)
nzo.fail_msg = T('Aborted, rating filter matched (%s)') % reason
sabnzbd.nzbqueue.NzbQueue.do.end_job(nzo)
if rarfile.is_rarfile(filepath):
nzo.add_to_direct_unpacker(nzf)
else:
sabnzbd.nzbqueue.NzbQueue.do.remove(nzo.nzo_id, add_to_history=False, cleanup=False)
PostProcessor.do.process(nzo)
def assemble(self, nzf, path, dupe):
""" Assemble a NZF from its table of articles """
if os.path.exists(path):
unique_path = get_unique_filename(path)
if dupe:
path = unique_path
else:
renamer(path, unique_path)
def _assemble(nzf, path, dupe):
if os.path.exists(path):
unique_path = get_unique_filename(path)
if dupe:
path = unique_path
else:
renamer(path, unique_path)
md5 = hashlib.md5()
fout = open(path, 'ab')
decodetable = nzf.decodetable
md5 = hashlib.md5()
fout = open(path, 'ab')
decodetable = nzf.decodetable
for articlenum in decodetable:
# Break if deleted during writing
if nzf.nzo.status is Status.DELETED:
break
for articlenum in decodetable:
# Break if deleted during writing
if nzf.nzo.status is Status.DELETED:
break
# Sleep to allow decoder/assembler switching
sleep(0.0001)
article = decodetable[articlenum]
# Sleep to allow decoder/assembler switching
sleep(0.0001)
article = decodetable[articlenum]
data = ArticleCache.do.load_article(article)
data = ArticleCache.do.load_article(article)
if not data:
logging.info(T('%s missing'), article)
else:
# yenc data already decoded, flush it out
fout.write(data)
md5.update(data)
if not data:
logging.info(T('%s missing'), article)
else:
# yenc data already decoded, flush it out
fout.write(data)
md5.update(data)
fout.flush()
fout.close()
set_permissions(path)
nzf.md5sum = md5.digest()
del md5
fout.flush()
fout.close()
set_permissions(path)
nzf.md5sum = md5.digest()
del md5
return path
return path
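The assemble loop above feeds each decoded article into a single `hashlib.md5()` object as it is written out, so the job's MD5 is computed without re-reading the finished file. Incremental updates give the same digest as hashing the data in one pass; a minimal standalone sketch (hypothetical helper name):

```python
import hashlib

def incremental_md5(chunks):
    """Update one MD5 object per chunk, as the assembler does per article."""
    md5 = hashlib.md5()
    for chunk in chunks:
        md5.update(chunk)
    return md5.digest()
```

This is why `nzf.md5sum` can be set immediately after the write loop finishes, with no second pass over the file.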
def parse_par2_file(self, fname, table16k):
""" Get the hash table and the first-16k hash table from a PAR2 file
Return as dictionary, indexed on names or hashes for the first-16k table
For a full description of the par2 specification, visit:
http://parchive.sourceforge.net/docs/specifications/parity-volume-spec/article-spec.html
"""
table = {}
duplicates16k = []
try:
f = open(fname, 'rb')
except:
return table
try:
header = f.read(8)
while header:
name, hash, hash16k = parse_par2_file_packet(f, header)
if name:
table[name] = hash
if hash16k not in table16k:
table16k[hash16k] = name
else:
# Not unique, remove to avoid false-renames
duplicates16k.append(hash16k)
header = f.read(8)
except (struct.error, IndexError):
logging.info('Cannot use corrupt par2 file for QuickCheck, "%s"', fname)
table = {}
except:
logging.debug('QuickCheck parser crashed in file %s', fname)
logging.info('Traceback: ', exc_info=True)
table = {}
f.close()
# Have to remove duplicates at the end to make sure
# no trace is left in case of multi-duplicates
for hash16k in duplicates16k:
if hash16k in table16k:
old_name = table16k.pop(hash16k)
logging.debug('Par2-16k signature of %s not unique, discarding', old_name)
return table
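The duplicate handling above (collect the non-unique 16k hashes first, then purge them afterwards so no trace of any duplicate remains) can be exercised on its own. A Python 3 sketch with illustrative inputs; `build_16k_table` and its `(name, hash16k)` pairs are not SABnzbd's real packet stream:

```python
def build_16k_table(entries):
    """Index names by their first-16k hash, discarding any hash that maps
    to more than one name (mirrors the duplicates16k logic above).
    `entries` is a list of (name, hash16k) pairs."""
    table16k = {}
    duplicates16k = []
    for name, hash16k in entries:
        if hash16k not in table16k:
            table16k[hash16k] = name
        else:
            # Not unique: remember it so all traces can be removed later
            duplicates16k.append(hash16k)
    # Removing at the end guarantees even the first occurrence disappears
    for hash16k in duplicates16k:
        table16k.pop(hash16k, None)
    return table16k

table = build_16k_table([('a.rar', 'h1'), ('b.rar', 'h2'), ('c.rar', 'h1')])
assert table == {'h2': 'b.rar'}
```

Deferring the removal is what prevents a hash that occurs three or more times from sneaking back in: dropping entries inline would let the third occurrence repopulate the slot.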
def file_has_articles(nzf):
@@ -199,47 +247,10 @@ def file_has_articles(nzf):
return has
# For a full description of the par2 specification, visit:
# http://parchive.sourceforge.net/docs/specifications/parity-volume-spec/article-spec.html
def GetMD5Hashes(fname, force=False):
    """ Get the hash table from a PAR2 file
        Return as dictionary, indexed on names and True for utf8-encoded names
    """
    new_encoding = True
    table = {}
    if force or not flag_file(os.path.split(fname)[0], QCHECK_FILE):
        try:
            f = open(fname, 'rb')
        except:
            return table, new_encoding

        new_encoding = False
        try:
            header = f.read(8)
            while header:
                name, hash = ParseFilePacket(f, header)
                new_encoding |= is_utf8(name)
                if name:
                    table[name] = hash
                header = f.read(8)
        except (struct.error, IndexError):
            logging.info('Cannot use corrupt par2 file for QuickCheck, "%s"', fname)
            table = {}
        except:
            logging.debug('QuickCheck parser crashed in file %s', fname)
            logging.info('Traceback: ', exc_info=True)
            table = {}
        f.close()
    return table, new_encoding
def ParseFilePacket(f, header):
def parse_par2_file_packet(f, header):
    """ Look up and analyze a FileDesc package """
    nothing = None, None
    nothing = None, None, None

    if header != 'PAR2\0PKT':
        return nothing
@@ -271,8 +282,9 @@ def ParseFilePacket(f, header):
    for offset in range(0, len, 8):
        if data[offset:offset + 16] == "PAR 2.0\0FileDesc":
            hash = data[offset + 32:offset + 48]
            hash16k = data[offset + 48:offset + 64]
            filename = data[offset + 72:].strip('\0')
            return filename, hash
            return filename, hash, hash16k
    return nothing
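The fixed offsets used above can be checked against a synthetic packet body. The layout below (16-byte packet-type id, 16-byte file id, two 16-byte MD5 hashes, 8-byte length, padded filename) matches the slice positions in the code; the builder and `parse_filedesc_body` are illustrative, Python 3, and not SABnzbd API:

```python
def parse_filedesc_body(data):
    """Slice a PAR2 FileDesc packet body at the offsets used above.
    Offsets are relative to the start of the packet-type field."""
    hash_full = data[32:48]   # MD5 of the whole file
    hash16k = data[48:64]     # MD5 of the first 16k
    filename = data[72:].rstrip(b'\0')
    return filename, hash_full, hash16k

# Build a synthetic body: type id, file id, two hashes, length, name
body = (b'PAR 2.0\0FileDesc' +  # 16-byte packet type
        b'F' * 16 +             # file id
        b'H' * 16 +             # MD5 of the whole file
        b'K' * 16 +             # MD5 of the first 16k
        b'L' * 8 +              # file length (8 bytes)
        b'movie.mkv\0\0\0')     # NUL-padded filename
name, h, h16k = parse_filedesc_body(body)
assert (name, h, h16k) == (b'movie.mkv', b'H' * 16, b'K' * 16)
```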
@@ -429,3 +441,11 @@ def rating_filtered(rating, filename, abort):
    if any(check_keyword(k) for k in keywords.split(',')):
        return T('keywords')
    return None

def remove_warning_label(msg):
    """ Standardize errors by removing the obsolete
        "WARNING:" part in all languages """
    if ':' in msg:
        return msg.split(':')[1]
    return msg
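Because `split(':')[1]` keeps only the text between the first and second colon, a message that itself contains a colon is truncated. A small Python 3 sketch of the behavior shown above (a standalone copy for illustration, not the SABnzbd module):

```python
def remove_warning_label(msg):
    """Drop the localized 'WARNING:' prefix by splitting on ':'."""
    if ':' in msg:
        # Note: [1] keeps only the segment up to the next colon
        return msg.split(':')[1]
    return msg

assert remove_warning_label('WARNING: disk full') == ' disk full'
assert remove_warning_label('WARNING: C: full') == ' C'  # truncated
assert remove_warning_label('no label') == 'no label'
```

A `split(':', 1)` variant would keep everything after the first colon instead.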


@@ -59,54 +59,116 @@ if sabnzbd.WIN32:
else:
DEF_FOLDER_MAX = 256
##############################################################################
# Configuration instances
##############################################################################
sfv_check = OptionBool('misc', 'sfv_check', True)
quick_check_ext_ignore = OptionList('misc', 'quick_check_ext_ignore', ['nfo', 'sfv', 'srr'])
email_server = OptionStr('misc', 'email_server', validation=validate_server)
email_to = OptionList('misc', 'email_to', validation=validate_email)
email_from = OptionStr('misc', 'email_from', validation=validate_email)
email_account = OptionStr('misc', 'email_account')
email_pwd = OptionPassword('misc', 'email_pwd')
email_endjob = OptionNumber('misc', 'email_endjob', 0, 0, 2)
email_full = OptionBool('misc', 'email_full', False)
email_dir = OptionDir('misc', 'email_dir', create=True)
email_rss = OptionBool('misc', 'email_rss', False)
##############################################################################
# Special settings
##############################################################################
pre_script = OptionStr('misc', 'pre_script', 'None')
queue_complete = OptionStr('misc', 'queue_complete')
queue_complete_pers = OptionBool('misc', 'queue_complete_pers', False)
bandwidth_perc = OptionNumber('misc', 'bandwidth_perc', 0, 0, 100)
refresh_rate = OptionNumber('misc', 'refresh_rate', 0)
log_level = OptionNumber('logging', 'log_level', 1, -1, 2)
log_size = OptionStr('logging', 'max_log_size', '5242880')
log_backups = OptionNumber('logging', 'log_backups', 5, 1, 1024)
queue_limit = OptionNumber('misc', 'queue_limit', 20, 0)
configlock = OptionBool('misc', 'config_lock', 0)
##############################################################################
# One time trackers
##############################################################################
converted_nzo_pickles = OptionBool('misc', 'converted_nzo_pickles', False)
warned_old_queue = OptionNumber('misc', 'warned_old_queue', QUEUE_VERSION)
sched_converted = OptionBool('misc', 'sched_converted', False)
notified_new_skin = OptionNumber('misc', 'notified_new_skin', 0)
direct_unpack_tested = OptionBool('misc', 'direct_unpack_tested', False)
##############################################################################
# Config - General
##############################################################################
version_check = OptionNumber('misc', 'check_new_rel', 1)
autobrowser = OptionBool('misc', 'auto_browser', True)
replace_illegal = OptionBool('misc', 'replace_illegal', True)
pre_script = OptionStr('misc', 'pre_script', 'None')
script_can_fail = OptionBool('misc', 'script_can_fail', False)
start_paused = OptionBool('misc', 'start_paused', False)
language = OptionStr('misc', 'language', 'en')
enable_https_verification = OptionBool('misc', 'enable_https_verification', True)
selftest_host = OptionStr('misc', 'selftest_host', 'self-test.sabnzbd.org')
cherryhost = OptionStr('misc', 'host', DEF_HOST)
cherryport = OptionStr('misc', 'port', DEF_PORT)
https_port = OptionStr('misc', 'https_port')
username = OptionStr('misc', 'username')
password = OptionPassword('misc', 'password')
bandwidth_max = OptionStr('misc', 'bandwidth_max')
cache_limit = OptionStr('misc', 'cache_limit')
web_dir = OptionStr('misc', 'web_dir', DEF_STDINTF)
web_color = OptionStr('misc', 'web_color', '')
https_cert = OptionDir('misc', 'https_cert', 'server.cert', create=False)
https_key = OptionDir('misc', 'https_key', 'server.key', create=False)
https_chain = OptionDir('misc', 'https_chain', create=False)
enable_https = OptionBool('misc', 'enable_https', False)
inet_exposure = OptionNumber('misc', 'inet_exposure', 0, protect=True) # 0=local-only, 1=nzb, 2=api, 3=full_api, 4=webui, 5=webui with login for external
local_ranges = OptionList('misc', 'local_ranges', protect=True)
api_key = OptionStr('misc', 'api_key', create_api_key())
nzb_key = OptionStr('misc', 'nzb_key', create_api_key())
##############################################################################
# Config - Folders
##############################################################################
umask = OptionStr('misc', 'permissions', '', validation=validate_octal)
download_dir = OptionDir('misc', 'download_dir', DEF_DOWNLOAD_DIR, create=False, validation=validate_safedir)
download_free = OptionStr('misc', 'download_free')
complete_dir = OptionDir('misc', 'complete_dir', DEF_COMPLETE_DIR, create=False, apply_umask=True, validation=validate_notempty)
script_dir = OptionDir('misc', 'script_dir', create=True, writable=False)
nzb_backup_dir = OptionDir('misc', 'nzb_backup_dir', DEF_NZBBACK_DIR)
admin_dir = OptionDir('misc', 'admin_dir', DEF_ADMIN_DIR, validation=validate_safedir)
dirscan_dir = OptionDir('misc', 'dirscan_dir', create=False)
dirscan_speed = OptionNumber('misc', 'dirscan_speed', DEF_SCANRATE, 0, 3600)
password_file = OptionDir('misc', 'password_file', '', create=False)
log_dir = OptionDir('misc', 'log_dir', 'logs', validation=validate_notempty)
##############################################################################
# Config - Switches
##############################################################################
max_art_tries = OptionNumber('misc', 'max_art_tries', 3, 2)
load_balancing = OptionNumber('misc', 'load_balancing', 2)
top_only = OptionBool('misc', 'top_only', False)
sfv_check = OptionBool('misc', 'sfv_check', True)
quick_check_ext_ignore = OptionList('misc', 'quick_check_ext_ignore', ['nfo', 'sfv', 'srr'])
script_can_fail = OptionBool('misc', 'script_can_fail', False)
ssl_ciphers = OptionStr('misc', 'ssl_ciphers', '')
enable_unrar = OptionBool('misc', 'enable_unrar', True)
enable_unzip = OptionBool('misc', 'enable_unzip', True)
enable_7zip = OptionBool('misc', 'enable_7zip', True)
enable_recursive = OptionBool('misc', 'enable_recursive', True)
enable_filejoin = OptionBool('misc', 'enable_filejoin', True)
enable_tsjoin = OptionBool('misc', 'enable_tsjoin', True)
enable_par_cleanup = OptionBool('misc', 'enable_par_cleanup', True)
enable_all_par = OptionBool('misc', 'enable_all_par', False)
ignore_unrar_dates = OptionBool('misc', 'ignore_unrar_dates', False)
overwrite_files = OptionBool('misc', 'overwrite_files', False)
flat_unpack = OptionBool('misc', 'flat_unpack', False)
par_option = OptionStr('misc', 'par_option', '', validation=no_nonsense)
pre_check = OptionBool('misc', 'pre_check', False)
nice = OptionStr('misc', 'nice', '', validation=no_nonsense)
ionice = OptionStr('misc', 'ionice', '', validation=no_nonsense)
ignore_wrong_unrar = OptionBool('misc', 'ignore_wrong_unrar', False)
par2_multicore = OptionBool('misc', 'par2_multicore', True)
multipar = OptionBool('misc', 'multipar', sabnzbd.WIN32)
allow_streaming = OptionBool('misc', 'allow_streaming', False)
pre_check = OptionBool('misc', 'pre_check', False)
fail_hopeless_jobs = OptionBool('misc', 'fail_hopeless_jobs', True)
req_completion_rate = OptionNumber('misc', 'req_completion_rate', 100.2, 100, 200)
autodisconnect = OptionBool('misc', 'auto_disconnect', True)
no_dupes = OptionNumber('misc', 'no_dupes', 0)
no_series_dupes = OptionNumber('misc', 'no_series_dupes', 0)
pause_on_pwrar = OptionNumber('misc', 'pause_on_pwrar', 1)
ignore_samples = OptionBool('misc', 'ignore_samples', False)
auto_sort = OptionBool('misc', 'auto_sort', False)
direct_unpack = OptionBool('misc', 'direct_unpack', False)
direct_unpack_threads = OptionNumber('misc', 'direct_unpack_threads', 3, 1)
propagation_delay = OptionNumber('misc', 'propagation_delay', 0)
folder_rename = OptionBool('misc', 'folder_rename', True)
replace_spaces = OptionBool('misc', 'replace_spaces', False)
replace_dots = OptionBool('misc', 'replace_dots', False)
safe_postproc = OptionBool('misc', 'safe_postproc', True)
pause_on_post_processing = OptionBool('misc', 'pause_on_post_processing', False)
sanitize_safe = OptionBool('misc', 'sanitize_safe', False)
cleanup_list = OptionList('misc', 'cleanup_list')
unwanted_extensions = OptionList('misc', 'unwanted_extensions')
action_on_unwanted_extensions = OptionNumber('misc', 'action_on_unwanted_extensions', 0)
new_nzb_on_failure = OptionBool('misc', 'new_nzb_on_failure', False)
quota_size = OptionStr('misc', 'quota_size')
quota_day = OptionStr('misc', 'quota_day')
quota_resume = OptionBool('misc', 'quota_resume', False)
quota_period = OptionStr('misc', 'quota_period', 'm')
rating_enable = OptionBool('misc', 'rating_enable', False)
rating_host = OptionStr('misc', 'rating_host', 'api.oznzb.com')
@@ -129,40 +191,14 @@ rating_filter_pause_spam_confirm = OptionBool('misc', 'rating_filter_pause_spam_
rating_filter_pause_downvoted = OptionBool('misc', 'rating_filter_pause_downvoted', False)
rating_filter_pause_keywords = OptionStr('misc', 'rating_filter_pause_keywords')
top_only = OptionBool('misc', 'top_only', False)
autodisconnect = OptionBool('misc', 'auto_disconnect', True)
queue_complete = OptionStr('misc', 'queue_complete')
queue_complete_pers = OptionBool('misc', 'queue_complete_pers', False)
replace_spaces = OptionBool('misc', 'replace_spaces', False)
replace_dots = OptionBool('misc', 'replace_dots', False)
no_dupes = OptionNumber('misc', 'no_dupes', 0)
no_series_dupes = OptionNumber('misc', 'no_series_dupes', 0)
backup_for_duplicates = OptionBool('misc', 'backup_for_duplicates', True)
ignore_samples = OptionBool('misc', 'ignore_samples', False)
auto_sort = OptionBool('misc', 'auto_sort', False)
propagation_delay = OptionNumber('misc', 'propagation_delay', 0)
folder_rename = OptionBool('misc', 'folder_rename', True)
folder_max_length = OptionNumber('misc', 'folder_max_length', DEF_FOLDER_MAX, 20, 65000)
pause_on_pwrar = OptionNumber('misc', 'pause_on_pwrar', 1)
enable_meta = OptionBool('misc', 'enable_meta', True)
safe_postproc = OptionBool('misc', 'safe_postproc', True)
empty_postproc = OptionBool('misc', 'empty_postproc', False)
pause_on_post_processing = OptionBool('misc', 'pause_on_post_processing', False)
ampm = OptionBool('misc', 'ampm', False)
rss_filenames = OptionBool('misc', 'rss_filenames', False)
rss_odd_titles = OptionList('misc', 'rss_odd_titles', ['nzbindex.nl/', 'nzbindex.com/', 'nzbclub.com/'])
schedules = OptionList('misc', 'schedlines')
sched_converted = OptionBool('misc', 'sched_converted', False)
##############################################################################
# Config - Sorting
##############################################################################
enable_tv_sorting = OptionBool('misc', 'enable_tv_sorting', False)
tv_sort_string = OptionStr('misc', 'tv_sort_string')
tv_sort_countries = OptionNumber('misc', 'tv_sort_countries', 1)
tv_categories = OptionList('misc', 'tv_categories', '')
movie_rename_limit = OptionStr('misc', 'movie_rename_limit', '100M')
enable_movie_sorting = OptionBool('misc', 'enable_movie_sorting', False)
movie_sort_string = OptionStr('misc', 'movie_sort_string')
@@ -172,79 +208,90 @@ movie_categories = OptionList('misc', 'movie_categories', ['movies'])
enable_date_sorting = OptionBool('misc', 'enable_date_sorting', False)
date_sort_string = OptionStr('misc', 'date_sort_string')
date_categories = OptionStr('misc', 'date_categories', ['tv'])
date_categories = OptionList('misc', 'date_categories', ['tv'])
configlock = OptionBool('misc', 'config_lock', 0)
umask = OptionStr('misc', 'permissions', '', validation=validate_octal)
download_dir = OptionDir('misc', 'download_dir', DEF_DOWNLOAD_DIR, create=False, validation=validate_safedir)
download_free = OptionStr('misc', 'download_free')
complete_dir = OptionDir('misc', 'complete_dir', DEF_COMPLETE_DIR, create=False,
apply_umask=True, validation=validate_notempty)
script_dir = OptionDir('misc', 'script_dir', create=True, writable=False)
nzb_backup_dir = OptionDir('misc', 'nzb_backup_dir', DEF_NZBBACK_DIR)
admin_dir = OptionDir('misc', 'admin_dir', DEF_ADMIN_DIR, validation=validate_safedir)
dirscan_dir = OptionDir('misc', 'dirscan_dir', create=False)
dirscan_speed = OptionNumber('misc', 'dirscan_speed', DEF_SCANRATE, 0, 3600)
size_limit = OptionStr('misc', 'size_limit', '0')
password_file = OptionDir('misc', 'password_file', '', create=False)
fsys_type = OptionNumber('misc', 'fsys_type', 0, 0, 2)
##############################################################################
# Config - Scheduling and RSS
##############################################################################
schedules = OptionList('misc', 'schedlines')
rss_rate = OptionNumber('misc', 'rss_rate', 60, 15, 24 * 60)
##############################################################################
# Config - Specials
##############################################################################
# Bool switches
ampm = OptionBool('misc', 'ampm', False)
replace_illegal = OptionBool('misc', 'replace_illegal', True)
start_paused = OptionBool('misc', 'start_paused', False)
enable_all_par = OptionBool('misc', 'enable_all_par', False)
enable_par_cleanup = OptionBool('misc', 'enable_par_cleanup', True)
enable_unrar = OptionBool('misc', 'enable_unrar', True)
enable_unzip = OptionBool('misc', 'enable_unzip', True)
enable_7zip = OptionBool('misc', 'enable_7zip', True)
enable_filejoin = OptionBool('misc', 'enable_filejoin', True)
enable_tsjoin = OptionBool('misc', 'enable_tsjoin', True)
overwrite_files = OptionBool('misc', 'overwrite_files', False)
ignore_unrar_dates = OptionBool('misc', 'ignore_unrar_dates', False)
ignore_wrong_unrar = OptionBool('misc', 'ignore_wrong_unrar', False)
multipar = OptionBool('misc', 'multipar', sabnzbd.WIN32)
backup_for_duplicates = OptionBool('misc', 'backup_for_duplicates', True)
empty_postproc = OptionBool('misc', 'empty_postproc', False)
wait_for_dfolder = OptionBool('misc', 'wait_for_dfolder', False)
warn_empty_nzb = OptionBool('misc', 'warn_empty_nzb', True)
sanitize_safe = OptionBool('misc', 'sanitize_safe', False)
rss_filenames = OptionBool('misc', 'rss_filenames', False)
api_logging = OptionBool('misc', 'api_logging', True)
cherryhost = OptionStr('misc', 'host', DEF_HOST)
cherryport = OptionStr('misc', 'port', DEF_PORT)
https_port = OptionStr('misc', 'https_port')
username = OptionStr('misc', 'username')
password = OptionPassword('misc', 'password')
html_login = OptionBool('misc', 'html_login', True)
bandwidth_perc = OptionNumber('misc', 'bandwidth_perc', 0, 0, 100)
bandwidth_max = OptionStr('misc', 'bandwidth_max')
refresh_rate = OptionNumber('misc', 'refresh_rate', 0)
rss_rate = OptionNumber('misc', 'rss_rate', 60, 15, 24 * 60)
cache_limit = OptionStr('misc', 'cache_limit')
web_dir = OptionStr('misc', 'web_dir', DEF_STDINTF)
web_color = OptionStr('misc', 'web_color', '')
cleanup_list = OptionList('misc', 'cleanup_list')
warned_old_queue = OptionNumber('misc', 'warned_old_queue', QUEUE_VERSION)
notified_new_skin = OptionNumber('misc', 'notified_new_skin', 0)
converted_nzo_pickles = OptionBool('misc', 'converted_nzo_pickles', False)
unwanted_extensions = OptionList('misc', 'unwanted_extensions')
action_on_unwanted_extensions = OptionNumber('misc', 'action_on_unwanted_extensions', 0)
log_dir = OptionDir('misc', 'log_dir', 'logs', validation=validate_notempty)
log_level = OptionNumber('logging', 'log_level', 1, -1, 2)
log_size = OptionStr('logging', 'max_log_size', '5242880')
log_backups = OptionNumber('logging', 'log_backups', 5, 1, 1024)
https_cert = OptionDir('misc', 'https_cert', 'server.cert', create=False)
https_key = OptionDir('misc', 'https_key', 'server.key', create=False)
https_chain = OptionDir('misc', 'https_chain', create=False)
enable_https = OptionBool('misc', 'enable_https', False)
language = OptionStr('misc', 'language', 'en')
no_penalties = OptionBool('misc', 'no_penalties', False)
load_balancing = OptionNumber('misc', 'load_balancing', 2)
ipv6_servers = OptionNumber('misc', 'ipv6_servers', 1, 0, 2)
api_key = OptionStr('misc', 'api_key', create_api_key())
nzb_key = OptionStr('misc', 'nzb_key', create_api_key())
disable_key = OptionBool('misc', 'disable_api_key', False, protect=True)
api_warnings = OptionBool('misc', 'api_warnings', True, protect=True)
local_ranges = OptionList('misc', 'local_ranges', protect=True)
inet_exposure = OptionNumber('misc', 'inet_exposure', 0, protect=True) # 0=local-only, 1=nzb, 2=api, 3=full_api, 4=webui, 5=webui with login for external
max_art_tries = OptionNumber('misc', 'max_art_tries', 3, 2)
osx_menu = OptionBool('misc', 'osx_menu', True)
osx_speed = OptionBool('misc', 'osx_speed', True)
warn_dupl_jobs = OptionBool('misc', 'warn_dupl_jobs', True)
keep_awake = OptionBool('misc', 'keep_awake', True)
win_menu = OptionBool('misc', 'win_menu', True)
allow_incomplete_nzb = OptionBool('misc', 'allow_incomplete_nzb', False)
enable_bonjour = OptionBool('misc', 'enable_bonjour', True)
allow_duplicate_files = OptionBool('misc', 'allow_duplicate_files', False)
max_art_opt = OptionBool('misc', 'max_art_opt', False)
use_pickle = OptionBool('misc', 'use_pickle', False)
ipv6_hosting = OptionBool('misc', 'ipv6_hosting', False)
fixed_ports = OptionBool('misc', 'fixed_ports', False)
api_warnings = OptionBool('misc', 'api_warnings', True, protect=True)
disable_key = OptionBool('misc', 'disable_api_key', False, protect=True)
no_penalties = OptionBool('misc', 'no_penalties', False)
# Text values
rss_odd_titles = OptionList('misc', 'rss_odd_titles', ['nzbindex.nl/', 'nzbindex.com/', 'nzbclub.com/'])
folder_max_length = OptionNumber('misc', 'folder_max_length', DEF_FOLDER_MAX, 20, 65000)
req_completion_rate = OptionNumber('misc', 'req_completion_rate', 100.2, 100, 200)
selftest_host = OptionStr('misc', 'selftest_host', 'self-test.sabnzbd.org')
movie_rename_limit = OptionStr('misc', 'movie_rename_limit', '100M')
size_limit = OptionStr('misc', 'size_limit', '0')
fsys_type = OptionNumber('misc', 'fsys_type', 0, 0, 2)
show_sysload = OptionNumber('misc', 'show_sysload', 2, 0, 2)
history_limit = OptionNumber('misc', 'history_limit', 10, 0)
wait_ext_drive = OptionNumber('misc', 'wait_ext_drive', 5, 1, 60)
marker_file = OptionStr('misc', 'nomedia_marker', '')
ipv6_servers = OptionNumber('misc', 'ipv6_servers', 1, 0, 2)
##############################################################################
# Config - Notifications
##############################################################################
# [email]
email_server = OptionStr('misc', 'email_server', validation=validate_server)
email_to = OptionList('misc', 'email_to', validation=validate_email)
email_from = OptionStr('misc', 'email_from', validation=validate_email)
email_account = OptionStr('misc', 'email_account')
email_pwd = OptionPassword('misc', 'email_pwd')
email_endjob = OptionNumber('misc', 'email_endjob', 0, 0, 2)
email_full = OptionBool('misc', 'email_full', False)
email_dir = OptionDir('misc', 'email_dir', create=True)
email_rss = OptionBool('misc', 'email_rss', False)
email_cats = OptionList('misc', 'email_cats', ['*'])
# [ncenter]
ncenter_enable = OptionBool('ncenter', 'ncenter_enable', sabnzbd.DARWIN)
ncenter_cats = OptionList('ncenter', 'ncenter_cats', ['*'])
ncenter_prio_startup = OptionBool('ncenter', 'ncenter_prio_startup', True)
ncenter_prio_download = OptionBool('ncenter', 'ncenter_prio_download', False)
ncenter_prio_pp = OptionBool('ncenter', 'ncenter_prio_pp', False)
@@ -259,6 +306,7 @@ ncenter_prio_other = OptionBool('ncenter', 'ncenter_prio_other', False)
# [acenter]
acenter_enable = OptionBool('acenter', 'acenter_enable', sabnzbd.WIN32)
acenter_cats = OptionList('acenter', 'acenter_cats', ['*'])
acenter_prio_startup = OptionBool('acenter', 'acenter_prio_startup', False)
acenter_prio_download = OptionBool('acenter', 'acenter_prio_download', False)
acenter_prio_pp = OptionBool('acenter', 'acenter_prio_pp', False)
@@ -273,6 +321,7 @@ acenter_prio_other = OptionBool('acenter', 'acenter_prio_other', False)
# [ntfosd]
ntfosd_enable = OptionBool('ntfosd', 'ntfosd_enable', not sabnzbd.WIN32 and not sabnzbd.DARWIN)
ntfosd_cats = OptionList('ntfosd', 'ntfosd_cats', ['*'])
ntfosd_prio_startup = OptionBool('ntfosd', 'ntfosd_prio_startup', True)
ntfosd_prio_download = OptionBool('ntfosd', 'ntfosd_prio_download', False)
ntfosd_prio_pp = OptionBool('ntfosd', 'ntfosd_prio_pp', False)
@@ -287,6 +336,7 @@ ntfosd_prio_other = OptionBool('ntfosd', 'ntfosd_prio_other', False)
# [growl]
growl_enable = OptionBool('growl', 'growl_enable', False)
growl_cats = OptionList('growl', 'growl_cats', ['*'])
growl_server = OptionStr('growl', 'growl_server')
growl_password = OptionPassword('growl', 'growl_password')
growl_prio_startup = OptionBool('growl', 'growl_prio_startup', True)
@@ -303,6 +353,7 @@ growl_prio_other = OptionBool('growl', 'growl_prio_other', False)
# [prowl]
prowl_enable = OptionBool('prowl', 'prowl_enable', False)
prowl_cats = OptionList('prowl', 'prowl_cats', ['*'])
prowl_apikey = OptionStr('prowl', 'prowl_apikey')
prowl_prio_startup = OptionNumber('prowl', 'prowl_prio_startup', -3)
prowl_prio_download = OptionNumber('prowl', 'prowl_prio_download', -3)
@@ -321,6 +372,7 @@ pushover_token = OptionStr('pushover', 'pushover_token')
pushover_userkey = OptionStr('pushover', 'pushover_userkey')
pushover_device = OptionStr('pushover', 'pushover_device')
pushover_enable = OptionBool('pushover', 'pushover_enable')
pushover_cats = OptionList('pushover', 'pushover_cats', ['*'])
pushover_prio_startup = OptionNumber('pushover', 'pushover_prio_startup', -3)
pushover_prio_download = OptionNumber('pushover', 'pushover_prio_download', -2)
pushover_prio_pp = OptionNumber('pushover', 'pushover_prio_pp', -3)
@@ -335,6 +387,7 @@ pushover_prio_other = OptionNumber('pushover', 'pushover_prio_other', -3)
# [pushbullet]
pushbullet_enable = OptionBool('pushbullet', 'pushbullet_enable')
pushbullet_cats = OptionList('pushbullet', 'pushbullet_cats', ['*'])
pushbullet_apikey = OptionStr('pushbullet', 'pushbullet_apikey')
pushbullet_device = OptionStr('pushbullet', 'pushbullet_device')
pushbullet_prio_startup = OptionNumber('pushbullet', 'pushbullet_prio_startup', 0)
@@ -351,6 +404,7 @@ pushbullet_prio_other = OptionNumber('pushbullet', 'pushbullet_prio_other', 0)
# [nscript]
nscript_enable = OptionBool('nscript', 'nscript_enable')
nscript_cats = OptionList('nscript', 'nscript_cats', ['*'])
nscript_script = OptionStr('nscript', 'nscript_script')
nscript_parameters = OptionStr('nscript', 'nscript_parameters')
nscript_prio_startup = OptionBool('nscript', 'nscript_prio_startup', True)
@@ -365,26 +419,6 @@ nscript_prio_error = OptionBool('nscript', 'nscript_prio_error', False)
nscript_prio_queue_done = OptionBool('nscript', 'nscript_prio_queue_done', True)
nscript_prio_other = OptionBool('nscript', 'nscript_prio_other', False)
quota_size = OptionStr('misc', 'quota_size')
quota_day = OptionStr('misc', 'quota_day')
quota_resume = OptionBool('misc', 'quota_resume', False)
quota_period = OptionStr('misc', 'quota_period', 'm')
osx_menu = OptionBool('misc', 'osx_menu', True)
osx_speed = OptionBool('misc', 'osx_speed', True)
keep_awake = OptionBool('misc', 'keep_awake', True)
win_menu = OptionBool('misc', 'win_menu', True)
allow_incomplete_nzb = OptionBool('misc', 'allow_incomplete_nzb', False)
marker_file = OptionStr('misc', 'nomedia_marker', '')
wait_ext_drive = OptionNumber('misc', 'wait_ext_drive', 5, 1, 60)
queue_limit = OptionNumber('misc', 'queue_limit', 20, 0)
history_limit = OptionNumber('misc', 'history_limit', 10, 0)
show_sysload = OptionNumber('misc', 'show_sysload', 2, 0, 2)
enable_bonjour = OptionBool('misc', 'enable_bonjour', True)
allow_duplicate_files = OptionBool('misc', 'allow_duplicate_files', False)
warn_dupl_jobs = OptionBool('misc', 'warn_dupl_jobs', True)
new_nzb_on_failure = OptionBool('misc', 'new_nzb_on_failure', False)
##############################################################################
# Set root folders for Folder config-items


@@ -27,6 +27,7 @@ import shutil
import time
import random
from hashlib import md5
from urlparse import urlparse
import sabnzbd.misc
from sabnzbd.constants import CONFIG_VERSION, NORMAL_PRIORITY, DEFAULT_PRIORITY, MAX_WIN_DFOLDER
from sabnzbd.utils import configobj
@@ -768,9 +769,6 @@ def _read_config(path, try_backup=False):
        CFG['__encoding__'] = u'utf-8'
        CFG['__version__'] = unicode(CONFIG_VERSION)

    if 'misc' in CFG:
        compatibility_fix(CFG['misc'])

    # Use CFG data to set values for all static options
    for section in database:
        if section not in ('servers', 'categories', 'rss'):
@@ -854,6 +852,7 @@ def save_config(force=False):
    # Write new config file
    try:
        logging.info('Writing settings to INI file %s', filename)
        CFG.write()
        shutil.copymode(bakname, filename)
        modified = False
@@ -980,6 +979,21 @@ def define_rss():

def get_rss():
    global database
    try:
        # We have to remove non-separator commas by detecting whether the parts are valid URLs
        for feed_key in database['rss']:
            feed = database['rss'][feed_key]
            # Create a new, corrected list
            new_feed_uris = []
            for feed_uri in feed.uri():
                if new_feed_uris and not urlparse(feed_uri).scheme and urlparse(new_feed_uris[-1]).scheme:
                    # Current one has no scheme but the previous one does, so append to the previous
                    new_feed_uris[-1] += '%2C' + feed_uri
                    continue
                # Add full working URL
                new_feed_uris.append(feed_uri)
            # Set the new list
            feed.uri.set(new_feed_uris)
        return database['rss']
    except KeyError:
        return {}
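The URI-repair loop can be tested standalone: a list item with no URL scheme is glued (as an encoded comma) onto the previous item when that one does have a scheme. A Python 3 sketch (`urlparse` lives in `urllib.parse` on Python 3, in the `urlparse` module on the Python 2 this diff targets); `rejoin_uris` is an illustrative name:

```python
from urllib.parse import urlparse

def rejoin_uris(uris):
    """Re-join list items that were wrongly split on a comma inside a
    URL, mirroring the scheme-detection logic above."""
    fixed = []
    for uri in uris:
        if fixed and not urlparse(uri).scheme and urlparse(fixed[-1]).scheme:
            # No scheme here, but the previous entry is a real URL:
            # this fragment belongs to it, re-attach with an encoded comma
            fixed[-1] += '%2C' + uri
            continue
        fixed.append(uri)
    return fixed

assert rejoin_uris(['https://x/rss?q=a', 'b', 'https://y/rss']) == \
    ['https://x/rss?q=a%2Cb', 'https://y/rss']
```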
@@ -1087,22 +1101,3 @@ def create_api_key():
    # Return a hex digest of the md5, eg 49f68a5c8493ec2c0bf489821c21fc3b
    return m.hexdigest()

_FIXES = (
    ('enable_par_multicore', 'par2_multicore'),
)

def compatibility_fix(cf):
    """ Convert obsolete INI entries """
    for item in _FIXES:
        old, new = item
        try:
            cf[new]
        except KeyError:
            try:
                cf[new] = cf[old]
                del cf[old]
            except KeyError:
                pass
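The same rename logic, written with membership tests instead of the try/except above (equivalent for plain dicts), can be verified directly. A Python 3 sketch; `RENAMES` mirrors the `_FIXES` table:

```python
RENAMES = (('enable_par_multicore', 'par2_multicore'),)

def compatibility_fix(cf):
    """Move values from obsolete INI keys to their new names,
    leaving an existing new-style key untouched."""
    for old, new in RENAMES:
        if new not in cf and old in cf:
            cf[new] = cf.pop(old)

cf = {'enable_par_multicore': '1'}
compatibility_fix(cf)
assert cf == {'par2_multicore': '1'}
```

As in the original, a config that already has the new key keeps it; the obsolete key is only consumed when the new one is missing.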


@@ -27,7 +27,7 @@ REC_RAR_VERSION = 500
PNFO = namedtuple('PNFO', 'repair unpack delete script nzo_id filename password unpackstrht '
'msgid category url bytes_left bytes avg_stamp avg_date finished_files '
'active_files queued_files status priority missing bytes_missing')
'active_files queued_files status priority missing bytes_missing direct_unpack')
QNFO = namedtuple('QNFO', 'bytes bytes_left bytes_left_previous_page list q_size_list q_fullsize')
@@ -47,7 +47,6 @@ SCAN_FILE_NAME = 'watched_data2.sab'
FUTURE_Q_FOLDER = 'future'
JOB_ADMIN = '__ADMIN__'
VERIFIED_FILE = '__verified__'
QCHECK_FILE = '__skip_qcheck__'
RENAMES_FILE = '__renames__'
ATTRIB_FILE = 'SABnzbd_attrib'
REPAIR_REQUEST = 'repair-all.sab'


@@ -22,6 +22,7 @@ sabnzbd.decoder - article decoder
import binascii
import logging
import re
import hashlib
from time import sleep
from threading import Thread
@@ -30,8 +31,8 @@ from sabnzbd.constants import Status, MAX_DECODE_QUEUE, LIMIT_DECODE_QUEUE, SABY
import sabnzbd.articlecache
import sabnzbd.downloader
import sabnzbd.nzbqueue
from sabnzbd.encoding import yenc_name_fixer
from sabnzbd.misc import match_str
from sabnzbd.encoding import yenc_name_fixer, platform_encode
from sabnzbd.misc import match_str, is_obfuscated_filename
# Check for basic-yEnc
try:
@@ -68,6 +69,9 @@ class BadYenc(Exception):
        Exception.__init__(self)

YDEC_TRANS = ''.join([chr((i + 256 - 42) % 256) for i in xrange(256)])

class Decoder(Thread):
    def __init__(self, servers, queue):
@@ -113,7 +117,7 @@ class Decoder(Thread):
register = True
logging.debug("Decoding %s", art_id)
data = decode(article, lines, raw_data)
data = self.decode(article, lines, raw_data)
nzf.article_count += 1
found = True
@@ -176,7 +180,7 @@ class Decoder(Thread):
logging.info(logme)
if not found or killed:
new_server_found = self.__search_new_server(article)
new_server_found = self.search_new_server(article)
if new_server_found:
register = False
logme = None
@@ -185,19 +189,19 @@ class Decoder(Thread):
logme = T('Unknown Error while decoding %s') % art_id
logging.info(logme)
logging.info("Traceback: ", exc_info=True)
new_server_found = self.__search_new_server(article)
new_server_found = self.search_new_server(article)
if new_server_found:
register = False
logme = None
if logme:
if killed:
nzo.inc_log('killed_art_log', art_id)
nzo.increase_bad_articles_counter('killed_articles')
else:
nzo.inc_log('bad_art_log', art_id)
nzo.increase_bad_articles_counter('bad_articles')
else:
new_server_found = self.__search_new_server(article)
new_server_found = self.search_new_server(article)
if new_server_found:
register = False
elif nzo.precheck:
@@ -209,7 +213,100 @@ class Decoder(Thread):
if register:
sabnzbd.nzbqueue.NzbQueue.do.register_article(article, found)
def __search_new_server(self, article):
def decode(self, article, data, raw_data):
# Do we have SABYenc? Let it do all the work
if sabnzbd.decoder.SABYENC_ENABLED:
decoded_data, output_filename, crc, crc_expected, crc_correct = sabyenc.decode_usenet_chunks(raw_data, article.bytes)
# Assume it is yenc
article.nzf.type = 'yenc'
# Only set the name if it was found and not obfuscated
self.verify_filename(article, decoded_data, output_filename)
# CRC check
if not crc_correct:
raise CrcError(crc_expected, crc, decoded_data)
return decoded_data
# Continue for _yenc or Python-yEnc
# Filter out empty ones
data = filter(None, data)
# No point in continuing if we don't have any data left
if data:
nzf = article.nzf
yenc, data = yCheck(data)
ybegin, ypart, yend = yenc
decoded_data = None
# Deal with non-yencoded posts
if not ybegin:
found = False
try:
for i in xrange(min(40, len(data))):
if data[i].startswith('begin '):
nzf.type = 'uu'
found = True
# Pause the job and show warning
if nzf.nzo.status != Status.PAUSED:
nzf.nzo.pause()
msg = T('UUencode detected, only yEnc encoding is supported [%s]') % nzf.nzo.final_name
logging.warning(msg)
break
except IndexError:
raise BadYenc()
if found:
decoded_data = ''
else:
raise BadYenc()
# Deal with yenc encoded posts
elif ybegin and yend:
if 'name' in ybegin:
output_filename = yenc_name_fixer(ybegin['name'])
else:
output_filename = None
logging.debug("Possible corrupt header detected => ybegin: %s", ybegin)
nzf.type = 'yenc'
# Decode data
if HAVE_YENC:
decoded_data, crc = _yenc.decode_string(''.join(data))[:2]
partcrc = '%08X' % ((crc ^ -1) & 2 ** 32L - 1)
else:
data = ''.join(data)
for i in (0, 9, 10, 13, 27, 32, 46, 61):
j = '=%c' % (i + 64)
data = data.replace(j, chr(i))
decoded_data = data.translate(YDEC_TRANS)
crc = binascii.crc32(decoded_data)
partcrc = '%08X' % (crc & 2 ** 32L - 1)
if ypart:
crcname = 'pcrc32'
else:
crcname = 'crc32'
if crcname in yend:
_partcrc = yenc_name_fixer('0' * (8 - len(yend[crcname])) + yend[crcname].upper())
else:
_partcrc = None
logging.debug("Corrupt header detected => yend: %s", yend)
if not _partcrc == partcrc:
raise CrcError(_partcrc, partcrc, decoded_data)
else:
raise BadYenc()
# Parse filename if there was data
if decoded_data:
# Only set the name if it was found and not obfuscated
self.verify_filename(article, decoded_data, output_filename)
return decoded_data
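The pure-Python fallback path above (escape removal, the 42-offset translation built from YDEC_TRANS, and the 8-character upper-case CRC formatting compared against the `=yend` trailer) can be restated as a minimal, self-contained Python 3 sketch. This is an illustration of the yEnc rules, not the module's actual code, which uses `str.translate` with a precomputed 256-byte table for speed:

```python
import binascii

def ydecode(encoded):
    """Minimal yEnc decode sketch: skip line breaks, undo '=' escapes
    (escaped byte = value - 64), subtract the yEnc offset of 42
    (mod 256), and format the CRC the same way as the pcrc32/crc32
    fields in the =yend trailer."""
    out = bytearray()
    it = iter(encoded)
    for b in it:
        if b in (0x0D, 0x0A):        # CR/LF separate lines, carry no data
            continue
        if b == 0x3D:                # '=' marks an escaped byte
            b = (next(it) - 64) % 256
        out.append((b - 42) % 256)
    data = bytes(out)
    # Mask to 32 bits: crc32() can return a signed value on older Pythons
    crc_hex = '%08X' % (binascii.crc32(data) & 0xFFFFFFFF)
    return data, crc_hex
```

The explicit byte loop just makes the arithmetic visible; the CRC comparison against the header value is what raises `CrcError` in the code above.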
def search_new_server(self, article):
# Search new server
article.add_to_try_list(article.fetcher)
for server in self.servers:
@@ -223,98 +320,42 @@ class Decoder(Thread):
msg = T('%s => missing from all servers, discarding') % article
logging.info(msg)
article.nzf.nzo.inc_log('missing_art_log', msg)
article.nzf.nzo.increase_bad_articles_counter('missing_articles')
return False
YDEC_TRANS = ''.join([chr((i + 256 - 42) % 256) for i in xrange(256)])
def decode(article, data, raw_data):
# Do we have SABYenc? Let it do all the work
if sabnzbd.decoder.SABYENC_ENABLED:
decoded_data, output_filename, crc, crc_expected, crc_correct = sabyenc.decode_usenet_chunks(raw_data, article.bytes)
# Assume it is yenc
article.nzf.type = 'yenc'
# Only set the name if it was found
if output_filename:
article.nzf.filename = output_filename
# CRC check
if not crc_correct:
raise CrcError(crc_expected, crc, decoded_data)
return decoded_data
# Continue for _yenc or Python-yEnc
# Filter out empty ones
data = filter(None, data)
# No point in continuing if we don't have any data left
if data:
def verify_filename(self, article, decoded_data, yenc_filename):
""" Verify the filename provided by yEnc using par2 information,
otherwise falling back to the NZB name
"""
nzf = article.nzf
yenc, data = yCheck(data)
ybegin, ypart, yend = yenc
decoded_data = None
# Was this file already verified and did we get a name?
if nzf.filename_checked or not yenc_filename:
return
# Deal with non-yencoded posts
if not ybegin:
found = False
try:
for i in xrange(min(40, len(data))):
if data[i].startswith('begin '):
nzf.type = 'uu'
found = True
# Pause the job and show warning
if nzf.nzo.status != Status.PAUSED:
nzf.nzo.pause()
msg = T('UUencode detected, only yEnc encoding is supported [%s]') % nzf.nzo.final_name
logging.warning(msg)
break
except IndexError:
raise BadYenc()
# Set the md5-of-16k if this is the first article
if article.partnum == nzf.lowest_partnum:
nzf.md5of16k = hashlib.md5(decoded_data[:16384]).digest()
if found:
decoded_data = ''
else:
raise BadYenc()
# If we have the md5, use it to rename
if nzf.md5of16k:
# Don't check again, even if no match
nzf.filename_checked = True
# Find the match and rename
if nzf.md5of16k in nzf.nzo.md5of16k:
new_filename = platform_encode(nzf.nzo.md5of16k[nzf.md5of16k])
# Was it even new?
if new_filename != nzf.filename:
logging.info('Detected filename based on par2: %s -> %s', nzf.filename, new_filename)
nzf.nzo.renamed_file(new_filename, nzf.filename)
nzf.filename = new_filename
return
# Deal with yenc encoded posts
elif ybegin and yend:
if 'name' in ybegin:
nzf.filename = yenc_name_fixer(ybegin['name'])
else:
logging.debug("Possible corrupt header detected => ybegin: %s", ybegin)
nzf.type = 'yenc'
# Decode data
if HAVE_YENC:
decoded_data, crc = _yenc.decode_string(''.join(data))[:2]
partcrc = '%08X' % ((crc ^ -1) & 2 ** 32L - 1)
else:
data = ''.join(data)
for i in (0, 9, 10, 13, 27, 32, 46, 61):
j = '=%c' % (i + 64)
data = data.replace(j, chr(i))
decoded_data = data.translate(YDEC_TRANS)
crc = binascii.crc32(decoded_data)
partcrc = '%08X' % (crc & 2 ** 32L - 1)
if ypart:
crcname = 'pcrc32'
else:
crcname = 'crc32'
if crcname in yend:
_partcrc = yenc_name_fixer('0' * (8 - len(yend[crcname])) + yend[crcname].upper())
else:
_partcrc = None
logging.debug("Corrupt header detected => yend: %s", yend)
if not _partcrc == partcrc:
raise CrcError(_partcrc, partcrc, decoded_data)
else:
raise BadYenc()
return decoded_data
# Fallback to yenc/nzb name (also when there is no partnum=1)
# We also keep the NZB name in case it ends with ".par2" (usually correct)
if yenc_filename != nzf.filename and not is_obfuscated_filename(yenc_filename) and not nzf.filename.endswith('.par2'):
logging.info('Detected filename from yenc: %s -> %s', nzf.filename, yenc_filename)
nzf.nzo.renamed_file(yenc_filename, nzf.filename)
nzf.filename = yenc_filename
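The par2-based rename above relies on a property of par2 files: each target file is identified by the MD5 of its first 16 KiB. A minimal sketch of that lookup, with a hypothetical `md5of16k_map` standing in for `nzf.nzo.md5of16k`:

```python
import hashlib

def par2_rename_lookup(first_article_data, md5of16k_map):
    """Hash the first 16 KiB of decoded data and look the digest up in
    the digest -> filename map parsed from the par2 set. Returns the
    original filename, or None when the file is not in the map."""
    digest = hashlib.md5(first_article_data[:16384]).digest()
    return md5of16k_map.get(digest)
```

This is why the rename works even on obfuscated posts: the digest is computed from content, not from the (possibly meaningless) yEnc or NZB name.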
def yCheck(data):

sabnzbd/directunpacker.py (new file)

@@ -0,0 +1,387 @@
#!/usr/bin/python -OO
# Copyright 2008-2017 The SABnzbd-Team <team@sabnzbd.org>
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""
sabnzbd.directunpacker
"""
import os
import re
import time
import threading
import subprocess
import logging
import sabnzbd
import sabnzbd.cfg as cfg
from sabnzbd.misc import int_conv, clip_path, remove_all, globber, format_time_string, has_win_device
from sabnzbd.encoding import unicoder
from sabnzbd.newsunpack import build_command
from sabnzbd.postproc import prepare_extraction_path
from sabnzbd.utils.diskspeed import diskspeedmeasure
if sabnzbd.WIN32:
# Load Popen from the fixed unicode-aware subprocess module
from sabnzbd.utils.subprocess_fix import Popen
else:
# Load the regular Popen
from subprocess import Popen
MAX_ACTIVE_UNPACKERS = 10
ACTIVE_UNPACKERS = []
RAR_NR = re.compile(r'(.*?)(\.part(\d*).rar|\.r(\d*))$', re.IGNORECASE)
class DirectUnpacker(threading.Thread):
def __init__(self, nzo):
threading.Thread.__init__(self)
self.nzo = nzo
self.active_instance = None
self.killed = False
self.next_file_lock = threading.Condition(threading.RLock())
self.unpack_dir_info = None
self.cur_setname = None
self.cur_volume = 0
self.total_volumes = {}
self.unpack_time = 0.0
self.success_sets = []
self.next_sets = []
nzo.direct_unpacker = self
def stop(self):
pass
def save(self):
pass
def reset_active(self):
self.active_instance = None
self.cur_setname = None
self.cur_volume = 0
def check_requirements(self):
if not cfg.direct_unpack() or self.killed or not self.nzo.unpack or self.nzo.bad_articles or sabnzbd.newsunpack.RAR_PROBLEM:
return False
return True
def set_volumes_for_nzo(self):
""" Loop over all files to detect the names """
none_counter = 0
found_counter = 0
for nzf in self.nzo.files + self.nzo.finished_files:
nzf.setname, nzf.vol = analyze_rar_filename(nzf.filename)
# We matched?
if nzf.setname:
found_counter += 1
if nzf.setname not in self.total_volumes:
self.total_volumes[nzf.setname] = 0
self.total_volumes[nzf.setname] = max(self.total_volumes[nzf.setname], nzf.vol)
else:
none_counter += 1
# Too many not found? Probably obfuscated, ignore results
if none_counter > found_counter:
self.total_volumes = {}
def add(self, nzf):
""" Add jobs and start instance of DirectUnpack """
if not cfg.direct_unpack_tested():
test_disk_performance()
# Stop if something is wrong
if not self.check_requirements():
return
# Is this the first set?
if not self.cur_setname:
self.set_volumes_for_nzo()
self.cur_setname = nzf.setname
# Analyze updated filenames
nzf.setname, nzf.vol = analyze_rar_filename(nzf.filename)
# Are we doing this set?
if self.cur_setname == nzf.setname:
logging.debug('DirectUnpack queued %s for %s', nzf.filename, self.cur_setname)
# Is this the first one of the first set?
if not self.active_instance and not self.is_alive() and self.have_next_volume():
# Too many runners already?
if len(ACTIVE_UNPACKERS) >= cfg.direct_unpack_threads():
logging.info('Too many DirectUnpackers currently to start %s', self.cur_setname)
return
# Start the unrar command and the loop
self.create_unrar_instance()
self.start()
elif not any(test_nzf.setname == nzf.setname for test_nzf in self.next_sets):
# Need to store this for the future, only once per set!
self.next_sets.append(nzf)
# Wake up the thread to see if this is good to go
with self.next_file_lock:
self.next_file_lock.notify()
def run(self):
# Input and output
linebuf = ''
last_volume_linebuf = ''
unrar_log = []
start_time = time.time()
# Need to read char-by-char because there's no newline after new-disk message
while 1:
if not self.active_instance:
break
char = self.active_instance.stdout.read(1)
linebuf += char
if not char:
# End of program
break
# Error? Let post-processing handle it
if linebuf.endswith(('ERROR: ', 'Cannot create', 'in the encrypted file', 'CRC failed', \
'checksum failed', 'You need to start extraction from a previous volume', \
'password is incorrect', 'Write error', 'checksum error', \
'start extraction from a previous volume')):
logging.info('Error in DirectUnpack of %s', self.cur_setname)
self.abort()
# Did we reach the end?
if linebuf.endswith('All OK'):
# Stop timer and finish
self.unpack_time += time.time() - start_time
ACTIVE_UNPACKERS.remove(self)
# Add to success
self.success_sets.append(self.cur_setname)
logging.info('DirectUnpack completed for %s', self.cur_setname)
self.nzo.set_action_line(T('Unpacking'), T('Completed'))
# Write current log
unrar_log.append(linebuf.strip())
linebuf = ''
logging.debug('DirectUnpack Unrar output %s', '\n'.join(unrar_log))
unrar_log = []
# Are there more files left?
while self.nzo.files and not self.next_sets:
with self.next_file_lock:
self.next_file_lock.wait()
# Is there another set to do?
if self.next_sets:
# Start new instance
nzf = self.next_sets.pop(0)
self.reset_active()
self.cur_setname = nzf.setname
# Wait for the 1st volume to appear
self.wait_for_next_volume()
self.create_unrar_instance()
start_time = time.time()
else:
self.killed = True
break
if linebuf.endswith('[C]ontinue, [Q]uit '):
# Stop timer
self.unpack_time += time.time() - start_time
# Wait for the next one..
self.wait_for_next_volume()
# Possible that the instance was deleted while locked
if not self.killed:
# Give unrar some time to do its thing
self.active_instance.stdin.write('\n')
start_time = time.time()
time.sleep(0.1)
# Did we unpack a new volume? Sometimes UnRar hangs on 1 volume
if not last_volume_linebuf or last_volume_linebuf != linebuf:
# Next volume
self.cur_volume += 1
self.nzo.set_action_line(T('Unpacking'), self.get_formatted_stats())
logging.info('DirectUnpacked volume %s for %s', self.cur_volume, self.cur_setname)
last_volume_linebuf = linebuf
if linebuf.endswith('\n'):
unrar_log.append(linebuf.strip())
linebuf = ''
# Add last line
unrar_log.append(linebuf.strip())
logging.debug('DirectUnpack Unrar output %s', '\n'.join(unrar_log))
# Save information if success
if self.success_sets:
msg = T('Unpacked %s files/folders in %s') % (len(globber(self.unpack_dir_info[0])), format_time_string(self.unpack_time))
msg = '%s - %s' % (T('Direct Unpack'), msg)
self.nzo.set_unpack_info('Unpack', '[%s] %s' % (unicoder(self.cur_setname), msg))
# Make more space
self.reset_active()
if self in ACTIVE_UNPACKERS:
ACTIVE_UNPACKERS.remove(self)
def have_next_volume(self):
""" Check if the next volume of the set is available, starting
from the end of the list, where the latest completed files are """
for nzf_search in reversed(self.nzo.finished_files):
if nzf_search.setname == self.cur_setname and nzf_search.vol == (self.cur_volume+1):
return nzf_search
return False
def wait_for_next_volume(self):
""" Wait for the correct volume to appear
But stop if it was killed or the NZB is done """
while not self.have_next_volume() and not self.killed and self.nzo.files:
with self.next_file_lock:
self.next_file_lock.wait()
def create_unrar_instance(self):
""" Start the unrar instance using the user's options """
# Generate extraction path and save for post-proc
if not self.unpack_dir_info:
self.unpack_dir_info = prepare_extraction_path(self.nzo)
extraction_path, _, _, one_folder, _ = self.unpack_dir_info
# Set options
if self.nzo.password:
password_command = '-p%s' % self.nzo.password
else:
password_command = '-p-'
if one_folder or cfg.flat_unpack():
action = 'e'
else:
action = 'x'
# The first NZF
rarfile_nzf = self.have_next_volume()
# Generate command
rarfile_path = os.path.join(self.nzo.downpath, rarfile_nzf.filename)
if sabnzbd.WIN32:
if not has_win_device(rarfile_path):
command = ['%s' % sabnzbd.newsunpack.RAR_COMMAND, action, '-vp', '-idp', '-o+', '-ai', password_command,
'%s' % clip_path(rarfile_path), clip_path(extraction_path)]
else:
# Need long-path notation in case of forbidden-names
command = ['%s' % sabnzbd.newsunpack.RAR_COMMAND, action, '-vp', '-idp', '-o+', '-ai', password_command,
'%s' % clip_path(rarfile_path), '%s\\' % extraction_path]
else:
# Don't use "-ai" (not needed for non-Windows)
command = ['%s' % sabnzbd.newsunpack.RAR_COMMAND, action, '-vp', '-idp', '-o+', password_command,
'%s' % rarfile_path, '%s/' % extraction_path]
if cfg.ignore_unrar_dates():
command.insert(3, '-tsm-')
# Let's start from the first one!
self.cur_volume = 1
stup, need_shell, command, creationflags = build_command(command)
logging.debug('Running unrar for DirectUnpack %s', command)
self.active_instance = Popen(command, shell=need_shell, stdin=subprocess.PIPE,
stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
startupinfo=stup, creationflags=creationflags)
# Add to runners
ACTIVE_UNPACKERS.append(self)
# Doing the first
logging.info('DirectUnpacked volume %s for %s', self.cur_volume, self.cur_setname)
def abort(self):
""" Abort running instance and delete generated files """
if not self.killed:
logging.info('Aborting DirectUnpack for %s', self.cur_setname)
self.killed = True
# Abort Unrar
if self.active_instance:
self.active_instance.kill()
# We need to wait for it to kill the process
self.active_instance.wait()
# Wake up the thread
with self.next_file_lock:
self.next_file_lock.notify()
# No new sets
self.next_sets = []
self.success_sets = []
# Remove files
if self.unpack_dir_info:
extraction_path, _, _, _, _ = self.unpack_dir_info
remove_all(extraction_path, recursive=True)
# Remove dir-info
self.unpack_dir_info = None
# Reset settings
self.reset_active()
def get_formatted_stats(self):
""" Get percentage or number of rars done """
if self.cur_setname and self.cur_setname in self.total_volumes:
# This won't work on obfuscated posts
if self.total_volumes[self.cur_setname] >= self.cur_volume and self.cur_volume:
return '%02d/%02d' % (self.cur_volume, self.total_volumes[self.cur_setname])
return self.cur_volume
def analyze_rar_filename(filename):
""" Extract volume number and setname from rar-filenames
Both ".part01.rar" and ".r01" """
m = RAR_NR.search(filename)
if m:
if m.group(4):
# Special case: these sets start with ".rar" then ".r00", so ".rNN" is volume NN + 2
return m.group(1), int_conv(m.group(4)) + 2
return m.group(1), int_conv(m.group(3))
else:
# Detect if first of "rxx" set
if filename.endswith('.rar') and '.part' not in filename:
return os.path.splitext(filename)[0], 1
return None, None
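The volume-numbering rules implemented here (both ".partNN.rar" and ".rNN" naming, with a plain ".rar" treated as volume 1 of an ".rNN" set) can be illustrated with this self-contained re-statement; the filenames in the usage below are hypothetical:

```python
import os
import re

RAR_NR = re.compile(r'(.*?)(\.part(\d*).rar|\.r(\d*))$', re.IGNORECASE)

def volume_of(filename):
    """Return (setname, volume) following the same rules as
    analyze_rar_filename above."""
    m = RAR_NR.search(filename)
    if m:
        if m.group(4):
            # '.rNN' sets start with '.rar' then '.r00', so '.rNN' is volume NN + 2
            return m.group(1), int(m.group(4)) + 2
        return m.group(1), int(m.group(3) or 0)
    if filename.endswith('.rar') and '.part' not in filename:
        # Plain '.rar' without '.part' is the first volume of an '.rNN' set
        return os.path.splitext(filename)[0], 1
    return None, None
```

For example, `volume_of('linux.iso.part03.rar')` yields `('linux.iso', 3)`, while `volume_of('linux.iso.r00')` yields `('linux.iso', 2)` because the set already began with `.rar`.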
def abort_all():
""" Abort all running DirectUnpackers """
logging.info('Aborting all DirectUnpackers')
for direct_unpacker in ACTIVE_UNPACKERS:
direct_unpacker.abort()
def test_disk_performance():
""" Test the incomplete-dir performance and enable
Direct Unpack if good enough (> 40MB/s)
"""
if diskspeedmeasure(sabnzbd.cfg.download_dir.get_path()) > 40:
cfg.direct_unpack.set(True)
logging.warning(T('Direct Unpack was automatically enabled.') + ' ' + T('Jobs will start unpacking during the downloading to reduce post-processing time. Only works for jobs that do not need repair.'))
else:
logging.info('Direct Unpack was not enabled, incomplete folder disk speed below 40MB/s')
cfg.direct_unpack_tested.set(True)
config.save_config()


@@ -29,6 +29,7 @@ from sabnzbd.constants import *
import sabnzbd
from sabnzbd.misc import to_units, split_host, time_format
from sabnzbd.encoding import EmailFilter
from sabnzbd.notifier import check_cat
import sabnzbd.cfg as cfg
@@ -216,6 +217,9 @@ def send_with_template(prefix, parm, test=None):
def endjob(filename, cat, status, path, bytes, fail_msg, stages, script, script_output, script_ret, test=None):
""" Send end-of-job email """
# Is it allowed?
if not check_cat('email', cat):
return None
# Translate the stage names
tr = sabnzbd.api.Ttemplate


@@ -1300,7 +1300,7 @@ class ConfigFolders(object):
##############################################################################
SWITCH_LIST = \
('par2_multicore', 'par_option', 'top_only', 'ssl_ciphers',
('par_option', 'top_only', 'ssl_ciphers', 'direct_unpack',
'auto_sort', 'propagation_delay', 'auto_disconnect', 'flat_unpack',
'safe_postproc', 'no_dupes', 'replace_spaces', 'replace_dots',
'ignore_samples', 'pause_on_post_processing', 'nice', 'ionice',
@@ -1376,16 +1376,16 @@ class ConfigSwitches(object):
SPECIAL_BOOL_LIST = \
('start_paused', 'no_penalties', 'ignore_wrong_unrar', 'overwrite_files', 'enable_par_cleanup',
'queue_complete_pers', 'api_warnings', 'ampm', 'enable_unrar', 'enable_unzip', 'enable_7zip',
'enable_filejoin', 'enable_tsjoin', 'allow_streaming', 'ignore_unrar_dates', 'par2_multicore',
'enable_filejoin', 'enable_tsjoin', 'ignore_unrar_dates',
'multipar', 'osx_menu', 'osx_speed', 'win_menu', 'use_pickle', 'allow_incomplete_nzb',
'rss_filenames', 'ipv6_hosting', 'keep_awake', 'empty_postproc', 'html_login', 'wait_for_dfolder',
'max_art_opt', 'warn_empty_nzb', 'enable_bonjour','allow_duplicate_files', 'warn_dupl_jobs',
'replace_illegal', 'backup_for_duplicates', 'disable_api_key', 'api_logging', 'enable_meta',
'replace_illegal', 'backup_for_duplicates', 'disable_api_key', 'api_logging',
)
SPECIAL_VALUE_LIST = \
('size_limit', 'folder_max_length', 'fsys_type', 'movie_rename_limit', 'nomedia_marker',
'req_completion_rate', 'wait_ext_drive', 'history_limit', 'show_sysload',
'ipv6_servers', 'selftest_host', 'rating_host'
'direct_unpack_threads', 'ipv6_servers', 'selftest_host', 'rating_host'
)
SPECIAL_LIST_LIST = ('rss_odd_titles', 'quick_check_ext_ignore')
@@ -2695,39 +2695,39 @@ def GetRssLog(feed):
##############################################################################
LIST_EMAIL = (
'email_endjob', 'email_full',
'email_endjob', 'email_cats', 'email_full',
'email_server', 'email_to', 'email_from',
'email_account', 'email_pwd', 'email_dir', 'email_rss'
)
LIST_GROWL = ('growl_enable', 'growl_server', 'growl_password',
LIST_GROWL = ('growl_enable', 'growl_cats', 'growl_server', 'growl_password',
'growl_prio_startup', 'growl_prio_download', 'growl_prio_pp', 'growl_prio_complete', 'growl_prio_failed',
'growl_prio_disk_full', 'growl_prio_warning', 'growl_prio_error', 'growl_prio_queue_done', 'growl_prio_other',
'growl_prio_new_login')
LIST_NCENTER = ('ncenter_enable',
LIST_NCENTER = ('ncenter_enable', 'ncenter_cats',
'ncenter_prio_startup', 'ncenter_prio_download', 'ncenter_prio_pp', 'ncenter_prio_complete', 'ncenter_prio_failed',
'ncenter_prio_disk_full', 'ncenter_prio_warning', 'ncenter_prio_error', 'ncenter_prio_queue_done', 'ncenter_prio_other',
'ncenter_prio_new_login')
LIST_ACENTER = ('acenter_enable',
LIST_ACENTER = ('acenter_enable', 'acenter_cats',
'acenter_prio_startup', 'acenter_prio_download', 'acenter_prio_pp', 'acenter_prio_complete', 'acenter_prio_failed',
'acenter_prio_disk_full', 'acenter_prio_warning', 'acenter_prio_error', 'acenter_prio_queue_done', 'acenter_prio_other',
'acenter_prio_new_login')
LIST_NTFOSD = ('ntfosd_enable',
LIST_NTFOSD = ('ntfosd_enable', 'ntfosd_cats',
'ntfosd_prio_startup', 'ntfosd_prio_download', 'ntfosd_prio_pp', 'ntfosd_prio_complete', 'ntfosd_prio_failed',
'ntfosd_prio_disk_full', 'ntfosd_prio_warning', 'ntfosd_prio_error', 'ntfosd_prio_queue_done', 'ntfosd_prio_other',
'ntfosd_prio_new_login')
LIST_PROWL = ('prowl_enable', 'prowl_apikey',
LIST_PROWL = ('prowl_enable', 'prowl_cats', 'prowl_apikey',
'prowl_prio_startup', 'prowl_prio_download', 'prowl_prio_pp', 'prowl_prio_complete', 'prowl_prio_failed',
'prowl_prio_disk_full', 'prowl_prio_warning', 'prowl_prio_error', 'prowl_prio_queue_done', 'prowl_prio_other',
'prowl_prio_new_login')
LIST_PUSHOVER = ('pushover_enable', 'pushover_token', 'pushover_userkey', 'pushover_device',
LIST_PUSHOVER = ('pushover_enable', 'pushover_cats', 'pushover_token', 'pushover_userkey', 'pushover_device',
'pushover_prio_startup', 'pushover_prio_download', 'pushover_prio_pp', 'pushover_prio_complete', 'pushover_prio_failed',
'pushover_prio_disk_full', 'pushover_prio_warning', 'pushover_prio_error', 'pushover_prio_queue_done', 'pushover_prio_other',
'pushover_prio_new_login')
LIST_PUSHBULLET = ('pushbullet_enable', 'pushbullet_apikey', 'pushbullet_device',
LIST_PUSHBULLET = ('pushbullet_enable', 'pushbullet_cats', 'pushbullet_apikey', 'pushbullet_device',
'pushbullet_prio_startup', 'pushbullet_prio_download', 'pushbullet_prio_pp', 'pushbullet_prio_complete', 'pushbullet_prio_failed',
'pushbullet_prio_disk_full', 'pushbullet_prio_warning', 'pushbullet_prio_error', 'pushbullet_prio_queue_done', 'pushbullet_prio_other',
'pushbullet_prio_new_login')
LIST_NSCRIPT = ('nscript_enable', 'nscript_script', 'nscript_parameters',
LIST_NSCRIPT = ('nscript_enable', 'nscript_cats', 'nscript_script', 'nscript_parameters',
'nscript_prio_startup', 'nscript_prio_download', 'nscript_prio_pp', 'nscript_prio_complete', 'nscript_prio_failed',
'nscript_prio_disk_full', 'nscript_prio_warning', 'nscript_prio_error', 'nscript_prio_queue_done', 'nscript_prio_other',
'nscript_prio_new_login')
@@ -2749,6 +2749,7 @@ class ConfigNotify(object):
conf = build_header(sabnzbd.WEB_DIR_CONFIG)
conf['my_home'] = sabnzbd.DIR_HOME
conf['categories'] = list_cats(False)
conf['lastmail'] = self.__lastmail
conf['have_growl'] = True
conf['have_ntfosd'] = sabnzbd.notifier.have_ntfosd()


@@ -247,10 +247,11 @@ def replace_win_devices(name):
def has_win_device(p):
""" Return True if the filename part contains a forbidden device name,
both before and after sanitizing
"""
p = os.path.split(p)[1].lower()
for dev in _DEVICES:
if p == dev or p.startswith(dev + '.'):
if p == dev or p.startswith(dev + '.') or p.startswith('_' + dev + '.'):
return True
return False
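The extra `'_' + dev + '.'` case added above catches names that a previous sanitize pass already prefixed with an underscore. A self-contained sketch, with an illustrative (not exhaustive) stand-in for the module's `_DEVICES` list:

```python
import os

# Illustrative subset of Windows reserved device basenames
_DEVICES = ('con', 'prn', 'aux', 'nul', 'com1', 'lpt1')

def has_win_device(p):
    """True if the basename is a reserved device name: bare, with any
    extension, or already sanitized with a leading underscore."""
    name = os.path.split(p)[1].lower()
    for dev in _DEVICES:
        if name == dev or name.startswith(dev + '.') or name.startswith('_' + dev + '.'):
            return True
    return False
```

Note the prefix checks require a following dot, so a name like `console.txt` is not flagged even though it starts with `con`.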
@@ -264,7 +265,7 @@ else:
CH_LEGAL = '+'
def sanitize_filename(name, allow_win_devices=False):
def sanitize_filename(name):
""" Return filename with illegal chars converted to legal ones
and with the par2 extension always in lowercase
"""
@@ -281,7 +282,7 @@ def sanitize_filename(name, allow_win_devices=False):
# Compensate for the foolish way par2 on OSX handles a colon character
name = name[name.rfind(':') + 1:]
if sabnzbd.WIN32 and not allow_win_devices:
if sabnzbd.WIN32 or cfg.sanitize_safe():
name = replace_win_devices(name)
lst = []
@@ -397,20 +398,11 @@ def sanitize_files_in_folder(folder):
return lst
def flag_file(path, flag, create=False):
""" Create verify flag file or return True if it already exists """
path = os.path.join(path, JOB_ADMIN)
path = os.path.join(path, flag)
if create:
try:
f = open(path, 'w')
f.write('ok\n')
f.close()
return True
except IOError:
return False
else:
return os.path.exists(path)
def is_obfuscated_filename(filename):
""" Check if this file has an extension; if not, it is
probably obfuscated and we don't use it
"""
return (os.path.splitext(filename)[1] == '')
##############################################################################


@@ -24,7 +24,7 @@ import sys
import re
import subprocess
import logging
from time import time
import time
import binascii
import shutil
@@ -32,11 +32,11 @@ import sabnzbd
from sabnzbd.encoding import TRANS, UNTRANS, unicoder, platform_encode, deunicode
import sabnzbd.utils.rarfile as rarfile
from sabnzbd.misc import format_time_string, find_on_path, make_script_path, int_conv, \
flag_file, real_path, globber, globber_full, get_all_passwords, renamer, clip_path, \
real_path, globber, globber_full, get_all_passwords, renamer, clip_path, \
has_win_device, calc_age
from sabnzbd.tvsort import SeriesSorter
import sabnzbd.cfg as cfg
from sabnzbd.constants import Status, QCHECK_FILE, RENAMES_FILE
from sabnzbd.constants import Status
if sabnzbd.WIN32:
try:
@@ -45,15 +45,18 @@ if sabnzbd.WIN32:
from win32process import STARTF_USESHOWWINDOW, IDLE_PRIORITY_CLASS
except ImportError:
pass
# Load Popen from the fixed unicode-aware subprocess module
from sabnzbd.utils.subprocess_fix import Popen
else:
# Define dummy WindowsError for non-Windows
class WindowsError(Exception):
def __init__(self, value):
self.parameter = value
def __str__(self):
return repr(self.parameter)
# Load the regular Popen
from subprocess import Popen
# Regex globals
RAR_RE = re.compile(r'\.(?P<ext>part\d*\.rar|rar|r\d\d|s\d\d|t\d\d|u\d\d|v\d\d|\d\d\d)$', re.I)
@@ -70,7 +73,6 @@ FULLVOLPAR2_RE = re.compile(r'(.*[^.])(\.*vol[0-9]+\+[0-9]+\.par2)', re.I)
TS_RE = re.compile(r'\.(\d+)\.(ts$)', re.I)
PAR2_COMMAND = None
PAR2C_COMMAND = None
MULTIPAR_COMMAND = None
RAR_COMMAND = None
NICE_COMMAND = None
@@ -92,7 +94,6 @@ def find_programs(curdir):
return None
if sabnzbd.DARWIN:
sabnzbd.newsunpack.PAR2C_COMMAND = check(curdir, 'osx/par2/par2-classic')
sabnzbd.newsunpack.PAR2_COMMAND = check(curdir, 'osx/par2/par2-sl64')
sabnzbd.newsunpack.RAR_COMMAND = check(curdir, 'osx/unrar/unrar')
sabnzbd.newsunpack.SEVEN_COMMAND = check(curdir, 'osx/7zip/7za')
@@ -100,15 +101,13 @@ def find_programs(curdir):
if sabnzbd.WIN32:
if sabnzbd.WIN64:
# 64 bit versions
sabnzbd.newsunpack.PAR2_COMMAND = check(curdir, 'win/par2/x64/par2.exe')
sabnzbd.newsunpack.MULTIPAR_COMMAND = check(curdir, 'win/par2/multipar/par2j64.exe')
sabnzbd.newsunpack.RAR_COMMAND = check(curdir, 'win/unrar/x64/UnRAR.exe')
else:
# 32 bit versions
sabnzbd.newsunpack.PAR2_COMMAND = check(curdir, 'win/par2/par2.exe')
sabnzbd.newsunpack.MULTIPAR_COMMAND = check(curdir, 'win/par2/multipar/par2j.exe')
sabnzbd.newsunpack.RAR_COMMAND = check(curdir, 'win/unrar/UnRAR.exe')
sabnzbd.newsunpack.PAR2C_COMMAND = check(curdir, 'win/par2/par2cmdline.exe')
sabnzbd.newsunpack.PAR2_COMMAND = check(curdir, 'win/par2/par2.exe')
sabnzbd.newsunpack.ZIP_COMMAND = check(curdir, 'win/unzip/unzip.exe')
sabnzbd.newsunpack.SEVEN_COMMAND = check(curdir, 'win/7zip/7za.exe')
else:
@@ -125,9 +124,6 @@ def find_programs(curdir):
if not sabnzbd.newsunpack.SEVEN_COMMAND:
sabnzbd.newsunpack.SEVEN_COMMAND = find_on_path('7z')
if not sabnzbd.newsunpack.PAR2C_COMMAND:
sabnzbd.newsunpack.PAR2C_COMMAND = sabnzbd.newsunpack.PAR2_COMMAND
if not (sabnzbd.WIN32 or sabnzbd.DARWIN):
# Run check on rar version
version, original = unrar_check(sabnzbd.newsunpack.RAR_COMMAND)
@@ -166,7 +162,7 @@ def external_processing(extern_proc, nzo, complete_dir, nicename, status):
logging.info('Running external script %s(%s, %s, %s, %s, %s, %s, %s, %s)',
extern_proc, complete_dir, nzo.filename, nicename, '', nzo.cat, nzo.group, status, failure_url)
p = subprocess.Popen(command, shell=need_shell, stdin=subprocess.PIPE,
p = Popen(command, shell=need_shell, stdin=subprocess.PIPE,
stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
startupinfo=stup, env=env, creationflags=creationflags)
@@ -207,7 +203,7 @@ def external_script(script, p1, p2, p3=None, p4=None):
stup, need_shell, command, creationflags = build_command(command)
env = create_env()
logging.info('Running user script %s(%s, %s)', script, p1, p2)
p = subprocess.Popen(command, shell=need_shell, stdin=subprocess.PIPE,
p = Popen(command, shell=need_shell, stdin=subprocess.PIPE,
stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
startupinfo=stup, env=env, creationflags=creationflags)
except:
@@ -475,27 +471,44 @@ def rar_unpack(nzo, workdir, workdir_complete, delete, one_folder, rars):
else:
extraction_path = os.path.split(rarpath)[0]
logging.info("Extracting rarfile %s (belonging to %s) to %s",
rarpath, rar_set, extraction_path)
# Is the direct-unpacker still running? We wait for it
if nzo.direct_unpacker:
while nzo.direct_unpacker.is_alive():
logging.debug('DirectUnpacker still alive for %s', nzo)
# Bump the file-lock in case it's stuck
with nzo.direct_unpacker.next_file_lock:
nzo.direct_unpacker.next_file_lock.notify()
time.sleep(2)
try:
fail, newfiles, rars = rar_extract(rarpath, len(rar_sets[rar_set]),
one_folder, nzo, rar_set, extraction_path)
# Was it aborted?
if not nzo.pp_active:
fail = 1
break
success = not fail
except:
success = False
fail = True
msg = sys.exc_info()[1]
nzo.fail_msg = T('Unpacking failed, %s') % msg
setname = nzo.final_name
nzo.set_unpack_info('Unpack', T('[%s] Error "%s" while unpacking RAR files') % (unicoder(setname), msg))
# Did we already direct-unpack it? Not when recursive-unpacking
if nzo.direct_unpacker and rar_set in nzo.direct_unpacker.success_sets:
logging.info("Set %s completed by DirectUnpack", rar_set)
fail = 0
success = 1
rars = rar_sets[rar_set]
newfiles = globber(extraction_path)
nzo.direct_unpacker.success_sets.remove(rar_set)
else:
logging.info("Extracting rarfile %s (belonging to %s) to %s",
rarpath, rar_set, extraction_path)
try:
fail, newfiles, rars = rar_extract(rarpath, len(rar_sets[rar_set]),
one_folder, nzo, rar_set, extraction_path)
# Was it aborted?
if not nzo.pp_active:
fail = 1
break
success = not fail
except:
success = False
fail = True
msg = sys.exc_info()[1]
nzo.fail_msg = T('Unpacking failed, %s') % msg
setname = nzo.final_name
nzo.set_unpack_info('Unpack', T('[%s] Error "%s" while unpacking RAR files') % (unicoder(setname), msg))
logging.error(T('Error "%s" while running rar_unpack on %s'), msg, setname)
logging.debug("Traceback: ", exc_info=True)
logging.error(T('Error "%s" while running rar_unpack on %s'), msg, setname)
logging.debug("Traceback: ", exc_info=True)
if success:
logging.debug('rar_unpack(): Rars: %s', rars)
@@ -530,7 +543,6 @@ def rar_extract(rarfile_path, numrars, one_folder, nzo, setname, extraction_path
with password tries
Return fail==0(ok)/fail==1(error)/fail==2(wrong password), new_files, rars
"""
fail = 0
new_files = None
rars = []
@@ -555,7 +567,7 @@ def rar_extract_core(rarfile_path, numrars, one_folder, nzo, setname, extraction
""" Unpack single rar set 'rarfile_path' to 'extraction_path'
Return fail==0(ok)/fail==1(error)/fail==2(wrong password)/fail==3(crc-error), new_files, rars
"""
start = time()
start = time.time()
logging.debug("rar_extract(): Extractionpath: %s", extraction_path)
@@ -579,13 +591,14 @@ def rar_extract_core(rarfile_path, numrars, one_folder, nzo, setname, extraction
if sabnzbd.WIN32:
# Use all flags
# See: https://github.com/sabnzbd/sabnzbd/pull/771
command = ['%s' % RAR_COMMAND, action, '-idp', overwrite, rename, '-ai', password_command,
'%s' % clip_path(rarfile_path), '%s\\' % extraction_path]
if not has_win_device(rarfile_path):
command = ['%s' % RAR_COMMAND, action, '-idp', overwrite, rename, '-ai', password_command,
'%s' % clip_path(rarfile_path), clip_path(extraction_path)]
else:
# Need long-path notation in case of forbidden-names
command = ['%s' % RAR_COMMAND, action, '-idp', overwrite, rename, '-ai', password_command,
'%s' % clip_path(rarfile_path), '%s\\' % extraction_path]
# If this is the retry without leading \\.\, we need to remove the \ at the end (yes..)
if not extraction_path.startswith('\\\\?\\'):
command[-1] = command[-1][:-1]
elif RAR_PROBLEM:
# Use only oldest options (specifically no "-or")
command = ['%s' % RAR_COMMAND, action, '-idp', overwrite, password_command,
@@ -600,9 +613,12 @@ def rar_extract_core(rarfile_path, numrars, one_folder, nzo, setname, extraction
stup, need_shell, command, creationflags = build_command(command)
# Get list of all the volumes part of this set
logging.debug("Analyzing rar file ... %s found", rarfile.is_rarfile(rarfile_path))
rarfiles = rarfile.RarFile(rarfile_path).volumelist()
logging.debug("Running unrar %s", command)
p = Popen(command, shell=need_shell, stdin=subprocess.PIPE,
stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
startupinfo=stup, creationflags=creationflags)
@@ -615,7 +631,6 @@ def rar_extract_core(rarfile_path, numrars, one_folder, nzo, setname, extraction
# Loop over the output from rar!
curr = 0
extracted = []
fail = 0
inrecovery = False
lines = []
@@ -638,9 +653,6 @@ def rar_extract_core(rarfile_path, numrars, one_folder, nzo, setname, extraction
lines.append(line)
if line.startswith('Extracting from'):
filename = TRANS((re.search(EXTRACTFROM_RE, line).group(1)))
curr += 1
nzo.set_action_line(T('Unpacking'), '%02d/%02d' % (curr, numrars))
@@ -683,12 +695,6 @@ def rar_extract_core(rarfile_path, numrars, one_folder, nzo, setname, extraction
logging.error(T('ERROR: write error (%s)'), line[11:])
fail = 1
elif line.startswith('Cannot create'):
line2 = proc.readline()
if 'must not exceed 260' in line2:
@@ -701,6 +707,8 @@ def rar_extract_core(rarfile_path, numrars, one_folder, nzo, setname, extraction
logging.error(T('ERROR: write error (%s)'), unicoder(line[13:]))
nzo.set_unpack_info('Unpack', unicoder(msg))
fail = 1
# Kill the process (can stay in endless loop on Windows Server)
p.kill()
elif line.startswith('ERROR: '):
nzo.fail_msg = T('Unpacking failed, see log')
@@ -763,13 +771,7 @@ def rar_extract_core(rarfile_path, numrars, one_folder, nzo, setname, extraction
proc.close()
p.wait()
logging.debug('UNRAR output %s', '\n'.join(lines))
return fail, (), ()
if proc:
proc.close()
@@ -777,7 +779,7 @@ def rar_extract_core(rarfile_path, numrars, one_folder, nzo, setname, extraction
logging.debug('UNRAR output %s', '\n'.join(lines))
nzo.fail_msg = ''
msg = T('Unpacked %s files/folders in %s') % (str(len(extracted)), format_time_string(time.time() - start))
nzo.set_unpack_info('Unpack', '[%s] %s' % (unicoder(setname), msg))
logging.info('%s', msg)
@@ -795,11 +797,11 @@ def unzip(nzo, workdir, workdir_complete, delete, one_folder, zips):
try:
i = 0
unzip_failed = False
tms = time.time()
for _zip in zips:
logging.info("Starting extract on zipfile: %s ", _zip)
nzo.set_action_line(T('Unpacking'), '%s' % unicoder(os.path.basename(_zip)))
if workdir_complete and _zip.startswith(workdir):
extraction_path = workdir_complete
@@ -811,7 +813,7 @@ def unzip(nzo, workdir, workdir_complete, delete, one_folder, zips):
else:
i += 1
msg = T('%s files in %s') % (str(i), format_time_string(time.time() - tms))
nzo.set_unpack_info('Unpack', msg)
# Delete the old files if we have to
@@ -854,7 +856,7 @@ def ZIP_Extract(zipfile, extraction_path, one_folder):
stup, need_shell, command, creationflags = build_command(command)
logging.debug('Starting unzip: %s', command)
p = Popen(command, shell=need_shell, stdin=subprocess.PIPE,
stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
startupinfo=stup, creationflags=creationflags)
@@ -875,7 +877,7 @@ def unseven(nzo, workdir, workdir_complete, delete, one_folder, sevens):
"""
i = 0
unseven_failed = False
tms = time.time()
# Find multi-volume sets, because 7zip will not provide actual set members
sets = {}
@@ -909,7 +911,7 @@ def unseven(nzo, workdir, workdir_complete, delete, one_folder, sevens):
i += 1
if not unseven_failed:
msg = T('%s files in %s') % (str(i), format_time_string(time.time() - tms))
nzo.set_unpack_info('Unpack', msg)
return unseven_failed
@@ -976,7 +978,7 @@ def seven_extract_core(sevenset, extensions, extraction_path, one_folder, delete
stup, need_shell, command, creationflags = build_command(command)
logging.debug('Starting 7za: %s', command)
p = Popen(command, shell=need_shell, stdin=subprocess.PIPE,
stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
startupinfo=stup, creationflags=creationflags)
@@ -1020,6 +1022,9 @@ def par2_repair(parfile_nzf, nzo, workdir, setname, single):
if os.path.exists(test_parfile):
parfile_nzf = new_par
break
else:
# No file was found, we assume this set already finished
return False, True
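The `break`/`else` pair above relies on Python's `for ... else` clause: the `else` body runs only when the loop completes without hitting `break`, which is how "no replacement par2 file was found" is detected. A minimal standalone sketch of the pattern (the names here are illustrative, not from the source):

```python
def find_existing(candidates, exists):
    """Return the first candidate for which exists() is true, else None."""
    for path in candidates:
        if exists(path):
            found = path
            break
    else:
        # Reached only when the loop finished without a break, i.e. no match
        return None
    return found

present = {'b.par2'}
assert find_existing(['a.par2', 'b.par2'], present.__contains__) == 'b.par2'
assert find_existing(['a.par2'], present.__contains__) is None
```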
# Shorten just the workdir on Windows
parfile = os.path.join(workdir, parfile_nzf.filename)
@@ -1053,7 +1058,6 @@ def par2_repair(parfile_nzf, nzo, workdir, setname, single):
return readd, result
if not result:
flag_file(workdir, QCHECK_FILE, True)
nzo.status = Status.REPAIRING
result = False
readd = False
@@ -1073,8 +1077,7 @@ def par2_repair(parfile_nzf, nzo, workdir, setname, single):
if finished:
result = True
logging.info('Par verify finished ok on %s!', parfile)
# Remove this set so we don't try to check it again
nzo.remove_parset(parfile_nzf.setname)
@@ -1097,43 +1100,18 @@ def par2_repair(parfile_nzf, nzo, workdir, setname, single):
try:
if cfg.enable_par_cleanup():
deletables = []
new_dir_content = os.listdir(workdir)
# Remove extra files created during repair and par2 base files
for path in new_dir_content:
if os.path.splitext(path)[1] == '.1' and path not in old_dir_content:
deletables.append(os.path.join(workdir, path))
deletables.append(os.path.join(workdir, setname + '.par2'))
deletables.append(os.path.join(workdir, setname + '.PAR2'))
deletables.append(parfile)
# Add output of par2-repair to remove
deletables.extend(used_joinables)
deletables.extend([os.path.join(workdir, f) for f in used_for_repair])
@@ -1164,36 +1142,17 @@ _RE_LOADING_PAR2 = re.compile(r'Loading "([^"]+)"\.')
_RE_LOADED_PAR2 = re.compile(r'Loaded (\d+) new packets')
def PAR_Verify(parfile, parfile_nzf, nzo, setname, joinables, single=False):
""" Run par2 on par-set """
used_joinables = []
used_for_repair = []
extra_par2_name = None
# set the current nzo status to "Verifying...". Used in History
nzo.status = Status.VERIFYING
start = time.time()
options = cfg.par_option().strip()
command = [str(PAR2_COMMAND), 'r', options, parfile]
# Append the wildcard for this set
parfolder = os.path.split(parfile)[0]
@@ -1214,7 +1173,7 @@ def PAR_Verify(parfile, parfile_nzf, nzo, setname, joinables, classic=False, sin
# We need to check for the bad par2cmdline that skips blocks
# Or the one that complains about basepath
# Only if we're not doing multicore
if not sabnzbd.WIN32 and not sabnzbd.DARWIN:
par2text = run_simple([command[0], '-h'])
if 'No data skipping' in par2text:
logging.info('Detected par2cmdline version that skips blocks, adding -N parameter')
@@ -1227,19 +1186,15 @@ def PAR_Verify(parfile, parfile_nzf, nzo, setname, joinables, classic=False, sin
stup, need_shell, command, creationflags = build_command(command)
# par2multicore wants to see \\.\ paths on Windows
# But par2cmdline doesn't support that notation, or \\?\ notation
# See: https://github.com/sabnzbd/sabnzbd/pull/771
if sabnzbd.WIN32:
command = [x.replace('\\\\?\\', '\\\\.\\', 1) if x.startswith('\\\\?\\') else x for x in command]
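The list comprehension above swaps a leading `\\?\` long-path prefix for the `\\.\` device notation, touching only the arguments that actually carry the prefix. A standalone sketch of the same transform (function name is illustrative):

```python
def to_device_notation(args):
    """Swap a leading \\?\ long-path prefix for \\.\ device notation."""
    return [a.replace('\\\\?\\', '\\\\.\\', 1) if a.startswith('\\\\?\\') else a
            for a in args]

cmd = ['par2', 'r', '\\\\?\\C:\\downloads\\set.par2']
assert to_device_notation(cmd) == ['par2', 'r', '\\\\.\\C:\\downloads\\set.par2']
```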
# Run the external command
logging.debug('Starting par2: %s', command)
lines = []
try:
p = Popen(command, shell=need_shell, stdin=subprocess.PIPE,
stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
startupinfo=stup, creationflags=creationflags)
@@ -1310,18 +1265,18 @@ def PAR_Verify(parfile, parfile_nzf, nzo, setname, joinables, classic=False, sin
logging.error(msg)
elif line.startswith('All files are correct'):
msg = T('[%s] Verified in %s, all files correct') % (unicoder(setname), format_time_string(time.time() - start))
nzo.set_unpack_info('Repair', msg)
logging.info('Verified in %s, all files correct',
format_time_string(time.time() - start))
finished = 1
elif line.startswith('Repair is required'):
msg = T('[%s] Verified in %s, repair is required') % (unicoder(setname), format_time_string(time.time() - start))
nzo.set_unpack_info('Repair', msg)
logging.info('Verified in %s, repair is required',
format_time_string(time.time() - start))
start = time.time()
verified = 1
elif line.startswith('Loading "'):
@@ -1362,22 +1317,6 @@ def PAR_Verify(parfile, parfile_nzf, nzo, setname, joinables, classic=False, sin
nzo.status = Status.FAILED
elif line.startswith('You need'):
# Because par2cmdline doesn't handle split files correctly
chunks = line.split()
needed_blocks = int(chunks[2])
avail_blocks = 0
@@ -1447,7 +1386,7 @@ def PAR_Verify(parfile, parfile_nzf, nzo, setname, joinables, classic=False, sin
nzo.status = Status.FAILED
elif line.startswith('Repair is possible'):
start = time.time()
nzo.set_action_line(T('Repairing'), '%2d%%' % (0))
elif line.startswith('Repairing:'):
@@ -1457,9 +1396,9 @@ def PAR_Verify(parfile, parfile_nzf, nzo, setname, joinables, classic=False, sin
nzo.status = Status.REPAIRING
elif line.startswith('Repair complete'):
msg = T('[%s] Repaired in %s') % (unicoder(setname), format_time_string(time.time() - start))
nzo.set_unpack_info('Repair', msg)
logging.info('Repaired in %s', format_time_string(time.time() - start))
finished = 1
elif line.startswith('File:') and line.find('data blocks from') > 0:
@@ -1483,27 +1422,20 @@ def PAR_Verify(parfile, parfile_nzf, nzo, setname, joinables, classic=False, sin
logging.debug('PAR2 will reconstruct "%s" from "%s"', new_name, old_name)
reconstructed.append(os.path.join(workdir, old_name))
elif 'Could not write' in line and 'at offset 0:' in line:
# If there are joinables, this error will only happen in case of 100% complete files
# We can just skip the retry, because par2cmdline will fail in those cases
# becauses it refuses to scan the ".001" file
if joinables:
finished = 1
used_joinables = []
elif ' cannot be renamed to ' in line:
msg = unicoder(line.strip())
nzo.fail_msg = msg
msg = u'[%s] %s' % (unicoder(setname), msg)
nzo.set_unpack_info('Repair', msg)
nzo.status = Status.FAILED
elif 'There is not enough space on the disk' in line:
# Oops, disk is full!
@@ -1570,35 +1502,24 @@ def PAR_Verify(parfile, parfile_nzf, nzo, setname, joinables, classic=False, sin
p.wait()
except WindowsError, err:
raise WindowsError(err)
logging.debug('PAR2 output was\n%s', '\n'.join(lines))
# If successful, add renamed files to the collection
if finished and renames:
nzo.renamed_file(renames)
# If successful and files were reconstructed, remove incomplete original files
if finished and reconstructed:
# Use 'used_joinables' as a vehicle to get rid of the files
used_joinables.extend(reconstructed)
return finished, readd, pars, datafiles, used_joinables, used_for_repair
_RE_FILENAME = re.compile(r'"([^"]+)"')
def MultiPar_Verify(parfile, parfile_nzf, nzo, setname, joinables, single=False):
""" Run par2 on par-set """
parfolder = os.path.split(parfile)[0]
used_joinables = []
@@ -1606,7 +1527,7 @@ def MultiPar_Verify(parfile, parfile_nzf, nzo, setname, joinables, classic=False
# set the current nzo status to "Verifying...". Used in History
nzo.status = Status.VERIFYING
start = time.time()
# Caching of verification implemented by adding:
# But not really required due to prospective-par2
@@ -1631,7 +1552,7 @@ def MultiPar_Verify(parfile, parfile_nzf, nzo, setname, joinables, classic=False
logging.info('Starting MultiPar: %s', command)
lines = []
p = Popen(command, shell=need_shell, stdin=subprocess.PIPE,
stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
startupinfo=stup, creationflags=creationflags)
@@ -1820,6 +1741,7 @@ def MultiPar_Verify(parfile, parfile_nzf, nzo, setname, joinables, classic=False
elif line.startswith('Finding available slice'):
# The actual scanning of the files
in_verify = True
nzo.set_action_line(T('Verifying'), T('Checking'))
elif in_verify:
m = _RE_FILENAME.search(line)
if m:
@@ -1930,20 +1852,20 @@ def MultiPar_Verify(parfile, parfile_nzf, nzo, setname, joinables, classic=False
# Result of verification
elif line.startswith('All Files Complete'):
# Completed without damage!
msg = T('[%s] Verified in %s, all files correct') % (unicoder(setname), format_time_string(time.time() - start))
nzo.set_unpack_info('Repair', msg)
logging.info('Verified in %s, all files correct',
format_time_string(time.time() - start))
finished = 1
elif line.startswith(('Ready to repair', 'Ready to rejoin')):
# Ready to repair!
# Or we are re-joining a split file when there's no damage but takes time
msg = T('[%s] Verified in %s, repair is required') % (unicoder(setname), format_time_string(time.time() - start))
nzo.set_unpack_info('Repair', msg)
logging.info('Verified in %s, repair is required',
format_time_string(time.time() - start))
start = time.time()
# Set message for user in case of joining
if line.startswith('Ready to rejoin'):
@@ -1952,7 +1874,7 @@ def MultiPar_Verify(parfile, parfile_nzf, nzo, setname, joinables, classic=False
# ----------------- Repair stage
elif 'Recovering slice' in line:
# Before this it will calculate matrix, here is where it starts
start = time.time()
in_repair = True
nzo.set_action_line(T('Repairing'), '%2d%%' % (0))
@@ -1970,9 +1892,9 @@ def MultiPar_Verify(parfile, parfile_nzf, nzo, setname, joinables, classic=False
nzo.status = Status.REPAIRING
elif line.startswith('Repaired successfully'):
msg = T('[%s] Repaired in %s') % (unicoder(setname), format_time_string(time.time() - start))
nzo.set_unpack_info('Repair', msg)
logging.info('Repaired in %s', format_time_string(time.time() - start))
finished = 1
elif in_verify_repaired and line.startswith('Repaired :'):
@@ -1990,15 +1912,12 @@ def MultiPar_Verify(parfile, parfile_nzf, nzo, setname, joinables, classic=False
# Even if the repair did not complete fully it will rename those!
# But the ones in 'Finding available slices'-section will only be renamed after succesfull repair
if renames:
# If succes, we also remove the possibly previously renamed ones
if finished:
reconstructed.extend(nzo.renames)
# Adding to the collection
nzo.renamed_file(renames)
# Remove renamed original files
workdir = os.path.split(parfile)[0]
@@ -2062,10 +1981,6 @@ def userxbit(filename):
def build_command(command):
""" Prepare list from running an external program """
if not sabnzbd.WIN32:
if command[0].endswith('.py'):
with open(command[0], 'r') as script_file:
@@ -2255,10 +2170,8 @@ def QuickCheck(set, nzo):
# Save renames
if renames:
nzo.renamed_file(renames)
return result
@@ -2389,7 +2302,7 @@ def pre_queue(name, pp, cat, script, priority, size, groups):
stup, need_shell, command, creationflags = build_command(command)
env = create_env()
logging.info('Running pre-queue script %s', command)
p = Popen(command, shell=need_shell, stdin=subprocess.PIPE,
stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
startupinfo=stup, env=env, creationflags=creationflags)
except:
@@ -2459,7 +2372,7 @@ class SevenZip(object):
command = [SEVEN_COMMAND, 'l', '-p', '-y', '-slt', self.path]
stup, need_shell, command, creationflags = build_command(command)
p = Popen(command, shell=need_shell, stdin=subprocess.PIPE,
stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
startupinfo=stup, creationflags=creationflags)
@@ -2486,7 +2399,7 @@ class SevenZip(object):
else:
stderr = open('/dev/null', 'w')
p = Popen(command, shell=need_shell, stdin=subprocess.PIPE,
stdout=subprocess.PIPE, stderr=stderr,
startupinfo=stup, creationflags=creationflags)
@@ -2502,7 +2415,7 @@ class SevenZip(object):
def run_simple(cmd):
""" Run simple external command and return output """
p = Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
txt = p.stdout.read()
p.wait()
return txt

View File

@@ -39,8 +39,9 @@ from sabnzbd.constants import NOTIFY_KEYS
from sabnzbd.misc import split_host, make_script_path
from sabnzbd.newsunpack import external_script
from gntp.core import GNTPRegister
from gntp.notifier import GrowlNotifier
import gntp.errors
try:
import Growl
# Detect classic Growl (older than 1.3)
@@ -58,6 +59,7 @@ try:
except:
_HAVE_NTFOSD = False
##############################################################################
# Define translatable message table
##############################################################################
@@ -89,13 +91,9 @@ def get_icon():
if not os.path.isfile(icon):
icon = os.path.join(sabnzbd.DIR_PROG, 'sabnzbd.ico')
if os.path.isfile(icon):
fp = open(icon, 'rb')
icon = fp.read()
fp.close()
else:
icon = None
return icon
@@ -122,7 +120,7 @@ def check_classes(gtype, section):
def get_prio(gtype, section):
""" Check prio of `gtype` in `section` """
try:
return sabnzbd.config.get_config(section, '%s_prio_%s' % (section, gtype))()
except TypeError:
@@ -130,20 +128,32 @@ def get_prio(gtype, section):
return -1000
def check_cat(section, job_cat):
""" Check if `job_cat` is enabled in `section`. * = All """
if not job_cat:
return True
try:
section_cats = sabnzbd.config.get_config(section, '%s_cats' % section)()
return ('*' in section_cats or job_cat in section_cats)
except TypeError:
logging.debug('Incorrect Notify option %s:%s_cats', section, section)
return True
def send_notification(title, msg, gtype, job_cat=None):
""" Send Notification message """
# Notification Center
if sabnzbd.DARWIN and sabnzbd.cfg.ncenter_enable():
if check_classes(gtype, 'ncenter') and check_cat('ncenter', job_cat):
send_notification_center(title, msg, gtype)
# Windows
if sabnzbd.WIN32 and sabnzbd.cfg.acenter_enable():
if check_classes(gtype, 'acenter') and check_cat('acenter', job_cat):
send_windows(title, msg, gtype)
# Growl
if sabnzbd.cfg.growl_enable() and check_classes(gtype, 'growl') and check_cat('growl', job_cat):
if _HAVE_CLASSIC_GROWL and not sabnzbd.cfg.growl_server():
return send_local_growl(title, msg, gtype)
else:
@@ -151,32 +161,33 @@ def send_notification(title, msg, gtype):
time.sleep(0.5)
# Prowl
if sabnzbd.cfg.prowl_enable() and check_cat('prowl', job_cat):
if sabnzbd.cfg.prowl_apikey():
Thread(target=send_prowl, args=(title, msg, gtype)).start()
time.sleep(0.5)
# Pushover
if sabnzbd.cfg.pushover_enable() and check_cat('pushover', job_cat):
if sabnzbd.cfg.pushover_token():
Thread(target=send_pushover, args=(title, msg, gtype)).start()
time.sleep(0.5)
# Pushbullet
if sabnzbd.cfg.pushbullet_enable() and check_cat('pushbullet', job_cat):
if sabnzbd.cfg.pushbullet_apikey() and check_classes(gtype, 'pushbullet'):
Thread(target=send_pushbullet, args=(title, msg, gtype)).start()
time.sleep(0.5)
# Notification script.
if sabnzbd.cfg.nscript_enable() and check_cat('nscript', job_cat):
if sabnzbd.cfg.nscript_script():
Thread(target=send_nscript, args=(title, msg, gtype)).start()
time.sleep(0.5)
# NTFOSD
if have_ntfosd() and sabnzbd.cfg.ntfosd_enable():
if check_classes(gtype, 'ntfosd') and check_cat('ntfosd', job_cat):
send_notify_osd(title, msg)
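Each backend above is now gated on both the notification class and the job category: `check_cat` passes when the job has no category, when the section's list contains `*`, or when it contains the job's own category. A standalone sketch of just that gate, with the config lookup replaced by a plain list for illustration (an assumption, not the real `sabnzbd.config` API):

```python
def category_enabled(section_cats, job_cat):
    """Return True when job_cat is enabled by section_cats ('*' means all)."""
    if not job_cat:
        # Jobs without a category are always notified
        return True
    return '*' in section_cats or job_cat in section_cats

assert category_enabled(['*'], 'tv')
assert category_enabled(['movies'], 'movies')
assert not category_enabled(['movies'], 'tv')
assert category_enabled(['movies'], None)
```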
def reset_growl():
@@ -193,6 +204,9 @@ def register_growl(growl_server, growl_password):
sys_name = hostname(host)
# Reduce logging of Growl in Debug/Info mode
logging.getLogger('gntp').setLevel(logging.WARNING)
# Clean up persistent data in GNTP to make re-registration work
GNTPRegister.notifications = []
GNTPRegister.headers = {}
@@ -216,7 +230,7 @@ def register_growl(growl_server, growl_password):
logging.debug(error)
del growler
ret = None
except (gntp.errors.NetworkError, gntp.errors.AuthError) as err:
error = 'Cannot register with Growl %s' % str(err)
logging.debug(error)
del growler
@@ -270,7 +284,7 @@ def send_growl(title, msg, gtype, test=None):
else:
logging.debug('Growl error %s', ret)
return 'Growl error %s', ret
except (gntp.errors.NetworkError, gntp.errors.AuthError) as err:
error = 'Growl error %s' % err
logging.debug(error)
return error

View File

@@ -111,6 +111,7 @@ class NzbQueue(object):
if sabnzbd.OLD_QUEUE and cfg.warned_old_queue() < QUEUE_VERSION:
logging.warning(T('Old queue detected, use Status->Repair to convert the queue'))
cfg.warned_old_queue.set(QUEUE_VERSION)
sabnzbd.config.save_config()
else:
# Try to process
try:
@@ -142,6 +143,7 @@ class NzbQueue(object):
# Done converting
cfg.converted_nzo_pickles.set(True)
sabnzbd.config.save_config()
nzo_ids = []
return nzo_ids
@@ -322,6 +324,8 @@ class NzbQueue(object):
nzo.set_pp(pp)
if explicit_priority is None:
self.set_priority(nzo_id, prio)
# Abort any ongoing unpacking if the category changed
nzo.abort_direct_unpacker()
result += 1
return result
@@ -329,6 +333,8 @@ class NzbQueue(object):
if nzo_id in self.__nzo_table:
nzo = self.__nzo_table[nzo_id]
logging.info('Renaming %s to %s', nzo.final_name, name)
# Abort any ongoing unpacking if the name changed (dirs change)
nzo.abort_direct_unpacker()
if not nzo.futuretype:
nzo.set_final_name_pw(name, password)
else:
@@ -394,7 +400,7 @@ class NzbQueue(object):
self.save(nzo)
if not (quiet or nzo.status in ('Fetching',)):
notifier.send_notification(T('NZB added to queue'), nzo.filename, 'download')
notifier.send_notification(T('NZB added to queue'), nzo.filename, 'download', nzo.cat)
if not quiet and cfg.auto_sort():
self.sort_by_avg_age()
@@ -460,6 +466,7 @@ class NzbQueue(object):
if nzf:
removed.append(nzf_id)
nzo.abort_direct_unpacker()
post_done = nzo.remove_nzf(nzf)
if post_done:
if nzo.finished_files:
@@ -843,6 +850,8 @@ class NzbQueue(object):
return empty
def cleanup_nzo(self, nzo, keep_basic=False, del_files=False):
# Abort DirectUnpack and let it remove files
nzo.abort_direct_unpacker()
nzo.purge_data(keep_basic, del_files)
ArticleCache.do.purge_articles(nzo.saved_articles)

View File

@@ -60,15 +60,18 @@ SUBJECT_FN_MATCHER = re.compile(r'"([^"]*)"')
PROBABLY_PAR2_RE = re.compile(r'(.*)\.vol(\d*)[\+\-](\d*)\.par2', re.I)
REJECT_PAR2_RE = re.compile(r'\.par2\.\d+', re.I) # Reject duplicate par2 files
RE_NORMAL_NAME = re.compile(r'\.\w{2,5}$') # Test reasonably sized extension at the end
RE_QUICK_PAR2_CHECK = re.compile(r'\.par2\W*', re.I)
RE_RAR = re.compile(r'(\.rar|\.r\d\d|\.s\d\d|\.t\d\d|\.u\d\d|\.v\d\d)$', re.I)
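For reference, `PROBABLY_PAR2_RE` splits a recovery-volume name into set name, starting volume, and block count. A quick standalone check of the pattern (the filename is an invented example):

```python
import re

# Copy of the pattern defined above
PROBABLY_PAR2_RE = re.compile(r'(.*)\.vol(\d*)[\+\-](\d*)\.par2', re.I)

m = PROBABLY_PAR2_RE.search('Some.Show.vol031+32.par2')
assert m.groups() == ('Some.Show', '031', '32')
assert PROBABLY_PAR2_RE.search('Some.Show.rar') is None
```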
##############################################################################
# Trylist
##############################################################################
TRYLIST_LOCK = threading.Lock()
class TryList(object):
""" TryList keeps track of which servers have been tried for a specific article
"""
# Pre-define attributes to save memory
__slots__ = ('__try_list', 'fetcher_priority')
@@ -79,16 +82,19 @@ class TryList(object):
def server_in_try_list(self, server):
""" Return whether specified server has been tried """
with TRYLIST_LOCK:
return server in self.__try_list
def add_to_try_list(self, server):
""" Register server as having been tried already """
with TRYLIST_LOCK:
if server not in self.__try_list:
self.__try_list.append(server)
def reset_try_list(self):
""" Clean the list """
with TRYLIST_LOCK:
self.__try_list = []
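The new module-level `TRYLIST_LOCK` serializes every mutation of the try list, so one lock guards all `TryList` instances. A minimal standalone sketch of the same guarded add/check/reset pattern (a simplification, not the full class):

```python
import threading

TRYLIST_LOCK = threading.Lock()

class TryList(object):
    __slots__ = ('_try_list',)

    def __init__(self):
        self._try_list = []

    def server_in_try_list(self, server):
        with TRYLIST_LOCK:
            return server in self._try_list

    def add_to_try_list(self, server):
        # Guarded and deduplicated, like the patched version
        with TRYLIST_LOCK:
            if server not in self._try_list:
                self._try_list.append(server)

    def reset_try_list(self):
        with TRYLIST_LOCK:
            self._try_list = []

t = TryList()
t.add_to_try_list('news1')
t.add_to_try_list('news1')  # second add is a no-op
assert t.server_in_try_list('news1')
t.reset_try_list()
assert not t.server_in_try_list('news1')
```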
##############################################################################
@@ -209,10 +215,10 @@ class Article(TryList):
# NzbFile
##############################################################################
NzbFileSaver = (
'date', 'subject', 'filename', 'filename_checked', 'type', 'is_par2', 'vol',
'blocks', 'setname', 'extrapars', 'articles', 'decodetable', 'bytes', 'bytes_left',
'article_count', 'nzo', 'nzf_id', 'deleted', 'valid', 'import_finished',
'md5sum', 'md5of16k'
)
@@ -229,6 +235,7 @@ class NzbFile(TryList):
self.subject = subject
self.type = None
self.filename = name_extractor(subject)
self.filename_checked = False
self.is_par2 = False
self.vol = None
@@ -250,6 +257,7 @@ class NzbFile(TryList):
self.import_finished = False
self.md5sum = None
self.md5of16k = None
self.valid = bool(article_db)
@@ -271,6 +279,13 @@ class NzbFile(TryList):
self.decodetable[partnum] = article
self.import_finished = True
else:
# TEMPORARY ERRORS
if not os.path.exists(os.path.join(self.nzo.workpath, self.nzf_id)):
logging.warning('Article DB file not found %s', self)
else:
# It was there, but empty
logging.warning('Article DB empty %s', self)
def remove_article(self, article, found):
""" Handle completed article, possibly end of file """
@@ -311,6 +326,11 @@ class NzbFile(TryList):
""" Is this file completed? """
return self.import_finished and not bool(self.articles)
@property
def lowest_partnum(self):
""" Get lowest article number of this file """
return min(self.decodetable)
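`min(self.decodetable)` works because iterating a dict yields its keys, so taking `min` over the part-number-keyed table returns the lowest article number directly:

```python
# decodetable maps part number -> article; contents here are illustrative
decodetable = {3: 'article-3', 1: 'article-1', 2: 'article-2'}
assert min(decodetable) == 1  # min over a dict iterates its keys
```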
def remove_admin(self):
""" Remove article database from disk (sabnzbd_nzf_<id>)"""
try:
@@ -439,7 +459,7 @@ class NzbParser(xml.sax.handler.ContentHandler):
if segm != self.article_db[partnum][0]:
msg = 'Duplicate part %s, but different ID-s (%s // %s)' % (partnum, self.article_db[partnum][0], segm)
logging.info(msg)
self.nzo.inc_log('dup_art_log', msg)
self.nzo.increase_bad_articles_counter('duplicate_articles')
else:
logging.info("Skipping duplicate article (%s)", segm)
else:
@@ -531,13 +551,13 @@ class NzbParser(xml.sax.handler.ContentHandler):
##############################################################################
NzbObjectSaver = (
'filename', 'work_name', 'final_name', 'created', 'bytes', 'bytes_downloaded', 'bytes_tried',
'repair', 'unpack', 'delete', 'script', 'cat', 'url', 'groups', 'avg_date',
'repair', 'unpack', 'delete', 'script', 'cat', 'url', 'groups', 'avg_date', 'md5of16k',
'partable', 'extrapars', 'md5packs', 'files', 'files_table', 'finished_files', 'status',
'avg_bps_freq', 'avg_bps_total', 'priority', 'dupe_table', 'saved_articles', 'nzo_id',
'futuretype', 'deleted', 'parsed', 'action_line', 'unpack_info', 'fail_msg', 'nzo_info',
'custom_name', 'password', 'next_save', 'save_timeout', 'encrypted',
'custom_name', 'password', 'next_save', 'save_timeout', 'encrypted', 'bad_articles',
'duplicate', 'oversized', 'precheck', 'incomplete', 'reuse', 'meta',
'md5sum', 'servercount', 'unwanted_ext', 'rating_filtered'
'md5sum', 'servercount', 'unwanted_ext', 'renames', 'rating_filtered'
)
# Lock to prevent errors when saving the NZO data
@@ -586,9 +606,11 @@ class NzbObject(TryList):
self.meta = {}
self.servercount = {} # Dict to keep bytes per server
self.created = False # dirprefixes + work_name created
self.direct_unpacker = None # Holds the DirectUnpacker instance
self.bytes = 0 # Original bytesize
self.bytes_downloaded = 0 # Downloaded bytes
self.bytes_tried = 0 # Which bytes did we try
self.bad_articles = 0 # How many bad (non-recoverable) articles
self.repair = r # True if we want to repair this set
self.unpack = u # True if we want to unpack this set
self.delete = d # True if we want to delete this set
@@ -604,10 +626,12 @@ class NzbObject(TryList):
self.partable = {} # Holds one parfile-name for each set
self.extrapars = {} # Holds the extra parfile names for all sets
self.md5packs = {} # Holds the md5pack for each set
self.md5packs = {} # Holds the md5pack for each set (name: hash)
self.md5of16k = {} # Holds the md5s of the first-16k of all files in the NZB (hash: name)
self.files = [] # List of all NZFs
self.files_table = {} # Dictionary of NZFs indexed using NZF_ID
self.renames = {} # Dictionary of all renamed files
self.finished_files = [] # List of all finished NZFs
@@ -887,8 +911,8 @@ class NzbObject(TryList):
if not self.password and self.meta.get('password'):
self.password = self.meta.get('password', [None])[0]
# Set nzo save-delay to minimum 30 seconds
self.save_timeout = max(30, min(6.0 * float(self.bytes) / GIGI, 300.0))
# Set nzo save-delay to minimum 120 seconds
self.save_timeout = max(120, min(6.0 * float(self.bytes) / GIGI, 300.0))
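The new save-timeout formula scales with job size between a floor and a ceiling; a minimal sketch of just that clamp (assuming `GIGI` is the byte count of one gibibyte, as elsewhere in SABnzbd):

```python
GIGI = float(2 ** 30)  # bytes in one gibibyte (assumed constant)

def save_timeout(job_bytes):
    """Seconds between periodic saves of the NZO admin to disk.

    Grows 6 seconds per GiB of job size, clamped to the
    range [120, 300] as in the changed line above.
    """
    return max(120, min(6.0 * float(job_bytes) / GIGI, 300.0))

# A 1 GiB job hits the 120 s floor, a 200 GiB job the 300 s ceiling.
```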
# In case pre-queue script or duplicate check want to move
# to history we first need an nzo_id by entering the NzbQueue
@@ -955,9 +979,9 @@ class NzbObject(TryList):
head, vol, block = analyse_par2(name)
if head and matcher(lparset, head.lower()):
xnzf.set_par2(parset, vol, block)
self.extrapars[parset].append(xnzf)
# Don't postpone during pre-check or if all par2 should be kept
if not self.precheck and cfg.enable_par_cleanup():
# Don't postpone if all par2 should be kept
if cfg.enable_par_cleanup():
self.extrapars[parset].append(xnzf)
self.files.remove(xnzf)
@synchronized(NZO_LOCK)
@@ -970,7 +994,7 @@ class NzbObject(TryList):
if not nzf.is_par2:
head, vol, block = analyse_par2(fn)
# Is a par2file and repair mode activated
if head and (self.repair or cfg.allow_streaming()):
if head and self.repair:
# Skip if mini-par2 is not complete and there are more par2 files
if not block and nzf.bytes_left and self.extrapars.get(head):
return
@@ -1033,6 +1057,7 @@ class NzbObject(TryList):
if not found:
# Add extra parfiles when there was a damaged article and not pre-checking
if self.extrapars and not self.precheck:
self.abort_direct_unpacker()
self.prospective_add(nzf)
post_done = False
@@ -1063,6 +1088,7 @@ class NzbObject(TryList):
if name in files:
files.remove(name)
files.append(renames[name])
self.renames = renames
# Looking for the longest name first, minimizes the chance on a mismatch
files.sort(lambda x, y: len(y) - len(x))
@@ -1083,6 +1109,7 @@ class NzbObject(TryList):
nzfs.remove(nzf)
files.remove(filename)
self.bytes_tried += nzf.bytes
self.bytes_downloaded += nzf.bytes
break
try:
@@ -1113,6 +1140,9 @@ class NzbObject(TryList):
def set_pp(self, value):
self.repair, self.unpack, self.delete = sabnzbd.pp_to_opts(value)
logging.info('Set pp=%s for job %s', value, self.final_name)
# Abort unpacking if not desired anymore
if not self.unpack:
self.abort_direct_unpacker()
self.save_to_disk()
@property
@@ -1191,7 +1221,8 @@ class NzbObject(TryList):
@synchronized(NZO_LOCK)
def remove_parset(self, setname):
self.partable.pop(setname)
if setname in self.partable:
self.partable.pop(setname)
@synchronized(NZO_LOCK)
def remove_extrapar(self, parfile):
@@ -1202,19 +1233,10 @@ class NzbObject(TryList):
if self.partable and _set in self.partable and self.partable[_set] and parfile in self.partable[_set].extrapars:
self.partable[_set].extrapars.remove(parfile)
__re_quick_par2_check = re.compile(r'\.par2\W*', re.I)
@synchronized(NZO_LOCK)
def prospective_add(self, nzf):
""" Add par2 files to compensate for missing articles
"""
# How many do we need?
bad = len(self.nzo_info.get('bad_art_log', []))
miss = len(self.nzo_info.get('missing_art_log', []))
killed = len(self.nzo_info.get('killed_art_log', []))
dups = len(self.nzo_info.get('dup_art_log', []))
total_need = bad + miss + killed + dups
# How many do we already have?
blocks_already = 0
for nzf_check in self.files:
@@ -1222,14 +1244,17 @@ class NzbObject(TryList):
if nzf_check.blocks:
blocks_already = blocks_already + int_conv(nzf_check.blocks)
# Make sure to also select a parset if it was in the original filename
original_filename = self.renames.get(nzf.filename, '')
# Need more?
if not nzf.is_par2 and blocks_already < total_need:
if not nzf.is_par2 and blocks_already < self.bad_articles:
# We have to find the right par-set
for parset in self.extrapars.keys():
if parset in nzf.filename and self.extrapars[parset]:
if (parset in nzf.filename or parset in original_filename) and self.extrapars[parset]:
extrapars_sorted = sorted(self.extrapars[parset], key=lambda x: x.blocks, reverse=True)
# Loop until we have enough
while blocks_already < total_need and extrapars_sorted:
while blocks_already < self.bad_articles and extrapars_sorted:
new_nzf = extrapars_sorted.pop()
# Reset NZF TryList, in case something was on it before it became extrapar
new_nzf.reset_try_list()
@@ -1240,6 +1265,17 @@ class NzbObject(TryList):
# Reset NZO TryList
self.reset_try_list()
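The block-selection loop in `prospective_add` reduces to this sketch: volumes are sorted largest-first and popped from the tail, so the smallest par2 volumes are consumed first until the accumulated block count covers the bad articles. The function and argument names here are illustrative stand-ins for the real NZF objects:

```python
def pick_extra_pars(volume_blocks, bad_articles, blocks_already=0):
    """Return the block counts of the par2 volumes to fetch.

    Mirrors the loop above: sort descending, pop from the end
    (i.e. smallest volume first) until we have enough blocks.
    """
    picked = []
    extrapars_sorted = sorted(volume_blocks, reverse=True)
    while blocks_already < bad_articles and extrapars_sorted:
        new_blocks = extrapars_sorted.pop()  # smallest remaining volume
        picked.append(new_blocks)
        blocks_already += new_blocks
    return picked
```

Popping the smallest volumes first keeps the extra download as close as possible to the amount of damage actually seen.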
def add_to_direct_unpacker(self, nzf):
""" Start or add to DirectUnpacker """
if not self.direct_unpacker:
sabnzbd.directunpacker.DirectUnpacker(self)
self.direct_unpacker.add(nzf)
def abort_direct_unpacker(self):
""" Abort any running DirectUnpackers """
if self.direct_unpacker:
self.direct_unpacker.abort()
def check_quality(self, req_ratio=0):
""" Determine amount of articles present on servers
and return (gross available, nett) bytes
@@ -1252,7 +1288,7 @@ class NzbObject(TryList):
nzf = self.files_table[nzf_id]
if nzf.deleted:
short += nzf.bytes_left
if self.__re_quick_par2_check.search(nzf.subject):
if RE_QUICK_PAR2_CHECK.search(nzf.subject):
pars += nzf.bytes
anypars = True
else:
@@ -1284,19 +1320,19 @@ class NzbObject(TryList):
msg1 = T('Downloaded in %s at an average of %sB/s') % (complete_time, to_units(avg_bps * 1024, dec_limit=1))
msg1 += u'<br/>' + T('Age') + ': ' + calc_age(self.avg_date, True)
bad = self.nzo_info.get('bad_art_log', [])
miss = self.nzo_info.get('missing_art_log', [])
killed = self.nzo_info.get('killed_art_log', [])
dups = self.nzo_info.get('dup_art_log', [])
bad = self.nzo_info.get('bad_articles', 0)
miss = self.nzo_info.get('missing_articles', 0)
killed = self.nzo_info.get('killed_articles', 0)
dups = self.nzo_info.get('duplicate_articles', 0)
msg2 = msg3 = msg4 = msg5 = ''
if bad:
msg2 = (u'<br/>' + T('%s articles were malformed')) % len(bad)
msg2 = (u'<br/>' + T('%s articles were malformed')) % bad
if miss:
msg3 = (u'<br/>' + T('%s articles were missing')) % len(miss)
msg3 = (u'<br/>' + T('%s articles were missing')) % miss
if dups:
msg4 = (u'<br/>' + T('%s articles had non-matching duplicates')) % len(dups)
msg4 = (u'<br/>' + T('%s articles had non-matching duplicates')) % dups
if killed:
msg5 = (u'<br/>' + T('%s articles were removed')) % len(killed)
msg5 = (u'<br/>' + T('%s articles were removed')) % killed
msg = u''.join((msg1, msg2, msg3, msg4, msg5, ))
self.set_unpack_info('Download', msg, unique=True)
if self.url:
@@ -1307,12 +1343,12 @@ class NzbObject(TryList):
self.set_unpack_info('Servers', ', '.join(msgs), unique=True)
@synchronized(NZO_LOCK)
def inc_log(self, log, txt):
""" Append string txt to nzo_info element "log" """
try:
self.nzo_info[log].append(txt)
except:
self.nzo_info[log] = [txt]
def increase_bad_articles_counter(self, type):
""" Record information about bad articles """
if type not in self.nzo_info:
self.nzo_info[type] = 0
self.nzo_info[type] += 1
self.bad_articles += 1
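`increase_bad_articles_counter` replaces the old per-type log lists with plain counters plus a running total; a standalone sketch of the scheme:

```python
class BadArticleCounter:
    """Sketch of the counting above: one counter per failure
    type in nzo_info, plus an overall bad_articles total."""

    def __init__(self):
        self.nzo_info = {}
        self.bad_articles = 0

    def increase_bad_articles_counter(self, article_type):
        # e.g. 'missing_articles', 'duplicate_articles', 'killed_articles'
        if article_type not in self.nzo_info:
            self.nzo_info[article_type] = 0
        self.nzo_info[article_type] += 1
        self.bad_articles += 1
```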
def server_allowed(self, server):
if not server.categories:
@@ -1432,6 +1468,17 @@ class NzbObject(TryList):
self.files[pos + 1] = nzf
self.files[pos] = tmp_nzf
@synchronized(NZO_LOCK)
def renamed_file(self, name_set, old_name=None):
""" Save renames at various stages (Download/PP)
to be used on Retry. Accepts strings and dicts.
"""
if not old_name:
# Add to dict
self.renames.update(name_set)
else:
self.renames[name_set] = old_name
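`renamed_file` accepts either a whole dict of renames or a single new/old pair; a minimal sketch of that dual interface, with the renames dict passed in explicitly instead of living on the NZO:

```python
def renamed_file(renames, name_set, old_name=None):
    """Record renames as in the method above: a dict merges in
    bulk, a (new_name, old_name) pair adds one entry."""
    if not old_name:
        renames.update(name_set)      # name_set is a dict here
    else:
        renames[name_set] = old_name  # name_set is the new filename
    return renames
```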
# Determine if rating information (including site identifier so rating can be updated)
# is present in metadata and if so store it
@synchronized(NZO_LOCK)
@@ -1491,6 +1538,8 @@ class NzbObject(TryList):
if keep_basic:
remove_all(wpath, 'SABnzbd_nz?_*', keep_folder=True)
remove_all(wpath, 'SABnzbd_article_*', keep_folder=True)
# We save the renames file
sabnzbd.save_data(self.renames, RENAMES_FILE, self.workpath)
else:
remove_all(wpath, recursive=True)
if del_files:
@@ -1527,9 +1576,9 @@ class NzbObject(TryList):
self.files if full else [],
queued_files,
self.status, self.priority,
len(self.nzo_info.get('missing_art_log', [])),
self.nzo_info.get('missing_articles', 0),
self.bytes_tried - self.bytes_downloaded,
)
self.direct_unpacker.get_formatted_stats() if self.direct_unpacker else 0)
def get_nzf_by_id(self, nzf_id):
if nzf_id in self.files_table:
@@ -1646,10 +1695,17 @@ class NzbObject(TryList):
self.avg_stamp = time.mktime(self.avg_date.timetuple())
self.wait = None
self.to_be_removed = False
self.direct_unpacker = None
if self.meta is None:
self.meta = {}
if self.servercount is None:
self.servercount = {}
if self.md5of16k is None:
self.md5of16k = {}
if self.renames is None:
self.renames = {}
if self.bad_articles is None:
self.bad_articles = 0
if self.bytes_tried is None:
# Fill with old info
self.bytes_tried = 0
@@ -1707,8 +1763,6 @@ def nzf_cmp_name(nzf1, nzf2, name=True):
if name:
# Prioritize .rar files above any other type of file (other than vol-par)
# Useful for nzb streaming
RE_RAR = re.compile(r'(\.rar|\.r\d\d|\.s\d\d|\.t\d\d|\.u\d\d|\.v\d\d)$', re.I)
m1 = RE_RAR.search(name1)
m2 = RE_RAR.search(name2)
if m1 and not (is_par2 or m2):

View File

@@ -59,25 +59,18 @@ class PostProcessor(Thread):
""" PostProcessor thread, designed as Singleton """
do = None # Link to instance of the thread
def __init__(self, queue=None, history_queue=None):
""" Initialize, optionally passing existing queue """
def __init__(self):
""" Initialize PostProcessor thread """
Thread.__init__(self)
# This history queue is simply used to log what active items to display in the web_ui
if history_queue:
self.history_queue = history_queue
else:
self.load()
self.load()
if self.history_queue is None:
self.history_queue = []
if queue:
self.queue = queue
else:
self.queue = Queue.Queue()
for nzo in self.history_queue:
self.process(nzo)
self.queue = Queue.Queue()
for nzo in self.history_queue:
self.process(nzo)
self.__stop = False
self.paused = False
PostProcessor.do = self
@@ -144,8 +137,10 @@ class PostProcessor(Thread):
def cancel_pp(self, nzo_id):
""" Change the status, so that the PP is canceled """
for nzo in self.history_queue:
if nzo.nzo_id == nzo_id and nzo.pp_active:
nzo.pp_active = False
if nzo.nzo_id == nzo_id:
nzo.abort_direct_unpacker()
if nzo.pp_active:
nzo.pp_active = False
return True
return None
@@ -257,13 +252,6 @@ def process_job(nzo):
# Get the NZB name
filename = nzo.final_name
if cfg.allow_streaming() and not (flag_repair or flag_unpack or flag_delete):
# After streaming, force +D
nzo.set_pp(3)
nzo.status = Status.FAILED
nzo.save_attribs()
all_ok = False
if nzo.fail_msg: # Special case: aborted due to too many missing data
nzo.status = Status.FAILED
nzo.save_attribs()
@@ -272,7 +260,6 @@ def process_job(nzo):
unpack_error = 1
try:
# Get the folder containing the download result
workdir = nzo.downpath
tmp_workdir_complete = None
@@ -308,11 +295,10 @@ def process_job(nzo):
logging.info('Starting Post-Processing on %s' +
' => Repair:%s, Unpack:%s, Delete:%s, Script:%s, Cat:%s',
filename, flag_repair, flag_unpack, flag_delete, script, cat)
filename, flag_repair, flag_unpack, flag_delete, script, nzo.cat)
# Set complete dir to workdir in case we need to abort
workdir_complete = workdir
dirname = nzo.final_name
marker_file = None
# Par processing, if enabled
@@ -331,45 +317,15 @@ def process_job(nzo):
all_ok = all_ok and not par_error
if all_ok:
# Fix encodings
fix_unix_encoding(workdir)
one_folder = False
# Determine class directory
catdir = config.get_categories(cat).dir()
if catdir.endswith('*'):
catdir = catdir.strip('*')
one_folder = True
complete_dir = real_path(cfg.complete_dir.get_path(), catdir)
complete_dir = long_path(complete_dir)
# TV/Movie/Date Renaming code part 1 - detect and construct paths
if cfg.enable_meta():
file_sorter = Sorter(nzo, cat)
# Use dirs generated by direct-unpacker
if nzo.direct_unpacker and nzo.direct_unpacker.unpack_dir_info:
tmp_workdir_complete, workdir_complete, file_sorter, one_folder, marker_file = nzo.direct_unpacker.unpack_dir_info
else:
file_sorter = Sorter(None, cat)
complete_dir = file_sorter.detect(dirname, complete_dir)
if file_sorter.sort_file:
one_folder = False
complete_dir = sanitize_and_trim_path(complete_dir)
if one_folder:
workdir_complete = create_dirs(complete_dir)
else:
workdir_complete = get_unique_path(os.path.join(complete_dir, dirname), create_dir=True)
marker_file = set_marker(workdir_complete)
if not workdir_complete or not os.path.exists(workdir_complete):
crash_msg = T('Cannot create final folder %s') % unicoder(os.path.join(complete_dir, dirname))
raise IOError
if cfg.folder_rename() and not one_folder:
tmp_workdir_complete = prefix(workdir_complete, '_UNPACK_')
try:
renamer(workdir_complete, tmp_workdir_complete)
except:
pass # On failure, just use the original name
else:
tmp_workdir_complete = workdir_complete
# Generate extraction path
tmp_workdir_complete, workdir_complete, file_sorter, one_folder, marker_file = prepare_extraction_path(nzo)
newfiles = []
# Run Stage 2: Unpack
@@ -420,7 +376,7 @@ def process_job(nzo):
# Check if this is an NZB-only download, if so redirect to queue
# except when PP was Download-only
if flag_repair:
nzb_list = nzb_redirect(tmp_workdir_complete, nzo.final_name, nzo.pp, script, cat, priority=nzo.priority)
nzb_list = nzb_redirect(tmp_workdir_complete, nzo.final_name, nzo.pp, script, nzo.cat, priority=nzo.priority)
else:
nzb_list = None
if nzb_list:
@@ -474,7 +430,7 @@ def process_job(nzo):
nzo.set_action_line(T('Running script'), unicoder(script))
nzo.set_unpack_info('Script', T('Running user script %s') % unicoder(script), unique=True)
script_log, script_ret = external_processing(script_path, nzo, clip_path(workdir_complete),
dirname, job_result)
nzo.final_name, job_result)
script_line = get_last_line(script_log)
if script_log:
script_output = nzo.nzo_id
@@ -498,7 +454,7 @@ def process_job(nzo):
# Email the results
if (not nzb_list) and cfg.email_endjob():
if (cfg.email_endjob() == 1) or (cfg.email_endjob() == 2 and (unpack_error or par_error or script_error)):
emailer.endjob(dirname, cat, all_ok, workdir_complete, nzo.bytes_downloaded,
emailer.endjob(nzo.final_name, nzo.cat, all_ok, workdir_complete, nzo.bytes_downloaded,
nzo.fail_msg, nzo.unpack_info, script, TRANS(script_log), script_ret)
if script_output:
@@ -539,12 +495,12 @@ def process_job(nzo):
logging.info("Traceback: ", exc_info=True)
crash_msg = T('see logfile')
nzo.fail_msg = T('PostProcessing was aborted (%s)') % unicoder(crash_msg)
notifier.send_notification(T('Download Failed'), filename, 'failed')
notifier.send_notification(T('Download Failed'), filename, 'failed', nzo.cat)
nzo.status = Status.FAILED
par_error = True
all_ok = False
if cfg.email_endjob():
emailer.endjob(dirname, cat, all_ok, clip_path(workdir_complete), nzo.bytes_downloaded,
emailer.endjob(nzo.final_name, nzo.cat, all_ok, clip_path(workdir_complete), nzo.bytes_downloaded,
nzo.fail_msg, nzo.unpack_info, '', '', 0)
if all_ok:
@@ -577,10 +533,10 @@ def process_job(nzo):
# Show final status in history
if all_ok:
notifier.send_notification(T('Download Completed'), filename, 'complete')
notifier.send_notification(T('Download Completed'), filename, 'complete', nzo.cat)
nzo.status = Status.COMPLETED
else:
notifier.send_notification(T('Download Failed'), filename, 'failed')
notifier.send_notification(T('Download Failed'), filename, 'failed', nzo.cat)
nzo.status = Status.FAILED
# Log the overall time taken for postprocessing
@@ -597,6 +553,57 @@ def process_job(nzo):
return True
def prepare_extraction_path(nzo):
""" Based on the information that we have, generate
the extraction path and create the directory.
Separated so it can be called from DirectUnpacker
"""
one_folder = False
marker_file = None
# Determine class directory
catdir = config.get_categories(nzo.cat).dir()
if catdir.endswith('*'):
catdir = catdir.strip('*')
one_folder = True
complete_dir = real_path(cfg.complete_dir.get_path(), catdir)
complete_dir = long_path(complete_dir)
# TV/Movie/Date Renaming code part 1 - detect and construct paths
file_sorter = Sorter(nzo, nzo.cat)
complete_dir = file_sorter.detect(nzo.final_name, complete_dir)
if file_sorter.sort_file:
one_folder = False
complete_dir = sanitize_and_trim_path(complete_dir)
if one_folder:
workdir_complete = create_dirs(complete_dir)
else:
workdir_complete = get_unique_path(os.path.join(complete_dir, nzo.final_name), create_dir=True)
marker_file = set_marker(workdir_complete)
if not workdir_complete or not os.path.exists(workdir_complete):
logging.error(T('Cannot create final folder %s') % unicoder(os.path.join(complete_dir, nzo.final_name)))
raise IOError
if cfg.folder_rename() and not one_folder:
prefixed_path = prefix(workdir_complete, '_UNPACK_')
tmp_workdir_complete = get_unique_path(prefix(workdir_complete, '_UNPACK_'), create_dir=False)
try:
renamer(workdir_complete, tmp_workdir_complete)
except:
pass # On failure, just use the original name
# Is the unique path different? Then we also need to modify the final path
if prefixed_path != tmp_workdir_complete:
workdir_complete = workdir_complete + os.path.splitext(tmp_workdir_complete)[1]
else:
tmp_workdir_complete = workdir_complete
return tmp_workdir_complete, workdir_complete, file_sorter, one_folder, marker_file
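The `_UNPACK_` branch above covers a subtle case: if `get_unique_path` had to add a numeric suffix to the temporary folder, the same suffix must be carried over to the final folder name. A sketch of just that suffix-propagation step (pure string logic, no filesystem):

```python
import os

def final_dir_with_suffix(workdir_complete, prefixed_path, tmp_workdir_complete):
    """If the unique temp dir gained a suffix (e.g. '.2'), append
    the same suffix to the final dir, as in the diff above."""
    if prefixed_path != tmp_workdir_complete:
        return workdir_complete + os.path.splitext(tmp_workdir_complete)[1]
    return workdir_complete
```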
def is_parfile(fn):
""" Check quickly whether file has par2 signature """
PAR_ID = "PAR2\x00PKT"
@@ -612,7 +619,7 @@ def is_parfile(fn):
def parring(nzo, workdir):
""" Perform par processing. Returns: (par_error, re_add) """
filename = nzo.final_name
notifier.send_notification(T('Post-processing'), filename, 'pp')
notifier.send_notification(T('Post-processing'), filename, 'pp', nzo.cat)
logging.info('Starting verification and repair of %s', filename)
# Get verification status of sets

View File

@@ -453,6 +453,8 @@ SKIN_TEXT = {
'explain-auto_disconnect' : TT('Disconnect from Usenet server(s) when queue is empty or paused.'),
'opt-auto_sort' : TT('Sort by Age'),
'explain-auto_sort' : TT('Automatically sort items by (average) age.'),
'opt-direct_unpack' : TT('Direct Unpack'),
'explain-direct_unpack' : TT('Jobs will start unpacking during downloading to reduce post-processing time. Only works for jobs that do not need repair.'),
'opt-propagation_delay' : TT('Propagation delay'),
'explain-propagation_delay' : TT('Posts will be paused until they are at least this age. Setting job priority to Force will skip the delay.'),
'opt-check_new_rel' : TT('Check for New Release'),
@@ -805,6 +807,8 @@ SKIN_TEXT = {
'Glitter-noSelect' : TT('Nothing selected!'),
'Glitter-removeSelected' : TT('Remove all selected files'),
'Glitter-toggleCompletedFiles' : TT('Hide/show completed files'),
'Glitter-top' : TT('Top'),
'Glitter-bottom' : TT('Bottom'),
'Glitter-retryJob' : TT('Retry'),
'Glitter-more' : TT('More'),
'Glitter-scriptLog' : TT('View Script Log'),

View File

@@ -30,6 +30,7 @@ import sabnzbd
from sabnzbd.misc import move_to_path, cleanup_empty_directories, get_unique_path, \
get_unique_filename, get_ext, renamer, sanitize_foldername, clip_path
from sabnzbd.constants import series_match, date_match, year_match, sample_match
from sabnzbd.encoding import unicoder
import sabnzbd.cfg as cfg
RE_SAMPLE = re.compile(sample_match, re.I)
@@ -895,7 +896,7 @@ def path_subst(path, mapping):
break
newpath.append(result)
n += 1
return ''.join(newpath)
return u''.join([unicoder(x) for x in newpath])
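The changed return line guards against mixing byte strings and unicode in a single `join`, which raises `UnicodeDecodeError` in Python 2 when a byte string holds non-ASCII data. A sketch of the idea with a stand-in for SABnzbd's `unicoder` (the helper name and the UTF-8 assumption are illustrative, not the real implementation):

```python
def to_text(value, encoding='utf-8'):
    """Stand-in for unicoder(): coerce each path fragment to text
    so one non-ASCII byte string cannot poison the join."""
    if isinstance(value, bytes):
        return value.decode(encoding, errors='replace')
    return value

def join_path_fragments(fragments):
    return u''.join(to_text(x) for x in fragments)
```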
def get_titles(nzo, match, name, titleing=False):

View File

@@ -73,9 +73,6 @@ class URLGrabber(Thread):
self.shutdown = False
while not self.shutdown:
# Don't pound the website!
time.sleep(5.0)
(url, future_nzo) = self.queue.get()
if not url:
@@ -361,7 +358,7 @@ def bad_fetch(nzo, url, msg='', content=False):
nzo.fail_msg = msg
notifier.send_notification(T('URL Fetching failed; %s') % '', '%s\n%s' % (msg, url), 'other')
notifier.send_notification(T('URL Fetching failed; %s') % '', '%s\n%s' % (msg, url), 'other', nzo.cat)
if cfg.email_endjob() > 0:
emailer.badfetch_mail(msg, url)

View File

@@ -0,0 +1,159 @@
## Fixing python 2.7 windows unicode issue with ``subprocess.Popen``.
## Copied from
## http://vaab.blog.kal.fr/2017/03/16/fixing-windows-python-2-7-unicode-issue-with-subprocesss-popen/
## https://gist.github.com/vaab/2ad7051fc193167f15f85ef573e54eb9
## issue: https://bugs.python.org/issue19264
import os
import ctypes
import subprocess
import _subprocess
from ctypes import byref, windll, c_char_p, c_wchar_p, c_void_p, \
Structure, sizeof, c_wchar, WinError
from ctypes.wintypes import BYTE, WORD, LPWSTR, BOOL, DWORD, LPVOID, \
HANDLE
##
## Types
##
CREATE_UNICODE_ENVIRONMENT = 0x00000400
LPCTSTR = c_char_p
LPTSTR = c_wchar_p
LPSECURITY_ATTRIBUTES = c_void_p
LPBYTE = ctypes.POINTER(BYTE)
class STARTUPINFOW(Structure):
_fields_ = [
("cb", DWORD), ("lpReserved", LPWSTR),
("lpDesktop", LPWSTR), ("lpTitle", LPWSTR),
("dwX", DWORD), ("dwY", DWORD),
("dwXSize", DWORD), ("dwYSize", DWORD),
("dwXCountChars", DWORD), ("dwYCountChars", DWORD),
("dwFillAtrribute", DWORD), ("dwFlags", DWORD),
("wShowWindow", WORD), ("cbReserved2", WORD),
("lpReserved2", LPBYTE), ("hStdInput", HANDLE),
("hStdOutput", HANDLE), ("hStdError", HANDLE),
]
LPSTARTUPINFOW = ctypes.POINTER(STARTUPINFOW)
class PROCESS_INFORMATION(Structure):
_fields_ = [
("hProcess", HANDLE), ("hThread", HANDLE),
("dwProcessId", DWORD), ("dwThreadId", DWORD),
]
LPPROCESS_INFORMATION = ctypes.POINTER(PROCESS_INFORMATION)
class DUMMY_HANDLE(ctypes.c_void_p):
def __init__(self, *a, **kw):
super(DUMMY_HANDLE, self).__init__(*a, **kw)
self.closed = False
def Close(self):
if not self.closed:
windll.kernel32.CloseHandle(self)
self.closed = True
def __int__(self):
return self.value
CreateProcessW = windll.kernel32.CreateProcessW
CreateProcessW.argtypes = [
LPCTSTR, LPTSTR, LPSECURITY_ATTRIBUTES,
LPSECURITY_ATTRIBUTES, BOOL, DWORD, LPVOID, LPCTSTR,
LPSTARTUPINFOW, LPPROCESS_INFORMATION,
]
CreateProcessW.restype = BOOL
##
## Patched functions/classes
##
def CreateProcess(executable, args, _p_attr, _t_attr,
inherit_handles, creation_flags, env, cwd,
startup_info):
"""Create a process supporting unicode executable and args for win32
Python implementation of CreateProcess using CreateProcessW for Win32
"""
si = STARTUPINFOW(
dwFlags=startup_info.dwFlags,
wShowWindow=startup_info.wShowWindow,
cb=sizeof(STARTUPINFOW),
## XXXvlab: not sure of the casting here to ints.
hStdInput=int(startup_info.hStdInput),
hStdOutput=int(startup_info.hStdOutput),
hStdError=int(startup_info.hStdError),
)
wenv = None
if env is not None:
## LPCWSTR seems to be c_wchar_p, so let's say CWSTR is c_wchar
env = (unicode("").join([
unicode("%s=%s\0") % (k, v)
for k, v in env.items()])) + unicode("\0")
wenv = (c_wchar * len(env))()
wenv.value = env
pi = PROCESS_INFORMATION()
creation_flags |= CREATE_UNICODE_ENVIRONMENT
if CreateProcessW(executable, args, None, None,
inherit_handles, creation_flags,
wenv, cwd, byref(si), byref(pi)):
return (DUMMY_HANDLE(pi.hProcess), DUMMY_HANDLE(pi.hThread),
pi.dwProcessId, pi.dwThreadId)
raise WinError()
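The environment handling inside `CreateProcess` above packs the dict into the flat, double-NUL-terminated block that `CreateProcessW` expects (`KEY=VALUE\0...\0\0`). The packing step in isolation:

```python
def make_env_block(env):
    """Build the CreateProcessW environment block: each entry is
    'KEY=VALUE' terminated by NUL, and the whole block ends with
    an extra NUL (so an empty dict is just a single NUL)."""
    return u''.join(u'%s=%s\0' % (k, v) for k, v in env.items()) + u'\0'
```

The original then copies this string into a `c_wchar` array so the kernel receives wide characters, which is why `CREATE_UNICODE_ENVIRONMENT` is OR-ed into the creation flags.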
class Popen(subprocess.Popen):
"""This superseeds Popen and corrects a bug in cPython 2.7 implem"""
def _execute_child(self, args, executable, preexec_fn, close_fds,
cwd, env, universal_newlines,
startupinfo, creationflags, shell, to_close,
p2cread, p2cwrite,
c2pread, c2pwrite,
errread, errwrite):
"""Code from part of _execute_child from Python 2.7 (9fbb65e)
There are only two small changes, concerning the construction of
the final string in shell mode: we preempt the creation of the
command string when shell is True, because the original function
would try to encode unicode args, which we want to avoid so that
they can be sent as-is to ``CreateProcess``.
"""
if not isinstance(args, subprocess.types.StringTypes):
args = subprocess.list2cmdline(args)
if startupinfo is None:
startupinfo = subprocess.STARTUPINFO()
if shell:
startupinfo.dwFlags |= _subprocess.STARTF_USESHOWWINDOW
startupinfo.wShowWindow = _subprocess.SW_HIDE
comspec = os.environ.get("COMSPEC", unicode("cmd.exe"))
args = unicode('{} /c "{}"').format(comspec, args)
if (_subprocess.GetVersion() >= 0x80000000 or
os.path.basename(comspec).lower() == "command.com"):
w9xpopen = self._find_w9xpopen()
args = unicode('"%s" %s') % (w9xpopen, args)
creationflags |= _subprocess.CREATE_NEW_CONSOLE
super(Popen, self)._execute_child(args, executable,
preexec_fn, close_fds, cwd, env, universal_newlines,
startupinfo, creationflags, False, to_close, p2cread,
p2cwrite, c2pread, c2pwrite, errread, errwrite)
_subprocess.CreateProcess = CreateProcess

View File

@@ -1,336 +0,0 @@
par2cmdline is a PAR 2.0 compatible file verification and repair tool.
To see the ongoing development see
https://github.com/BlackIkeEagle/par2cmdline
The original development was done on Sourceforge but stalled.
For more information from the original authors see
http://parchive.sourceforge.net
Also for details of the PAR 2.0 specification and discussion of all
things PAR.
WHAT EXACTLY IS PAR2CMDLINE?
par2cmdline is a program for creating and using PAR2 files to detect
damage in data files and repair them if necessary. It can be used with
any kind of file.
WHY IS PAR 2.0 better than PAR 1.0?
* It is not necessary to split a single large file into many equally
sized small files (although you can still do so if you wish).
* There is no loss of efficiency when operating on multiple files
of different sizes.
* It is possible to repair damaged files (using exactly the amount of
recovery data that corresponds to the amount of damage), rather than
requiring the complete reconstruction of the damaged file.
* Recovery files may be of different sizes making it possible to
obtain exactly the amount of recovery data required to carry out
a repair.
* Because damaged data files are still usable during the recovery
process, less recovery data is required to achieve a successful
repair. It is therefore not necessary to create as much recovery
data in the first place to achieve the same level of protection.
* You can protect up to 32768 files rather than the 256 that PAR 1.0
is limited to.
* Damaged or incomplete recovery files can also be used during the
recovery process in the same way that damaged data files can.
* PAR 2.0 requires less recovery data to provide the same level of
protection from damage compared with PAR 1.0.
DOES PAR 2.0 HAVE ANY DISADVANTAGES?
Yes, there is one disadvantage:
* All PAR 2.0 programs will take somewhat longer to create recovery
files than a PAR 1.0 program does.
This disadvantage is considerably mitigated by the fact that you don't
need to create as much recovery data in the first place to provide the
same level of protection against loss and damage.
COMPILING PAR2CMDLINE
You should have received par2cmdline in the form of source code which
you can compile on your computer. You may optionally have received a
precompiled version of the program for your operating system.
If you have only downloaded a precompiled executable, then the source
code should be available from the same location where you downloaded the
executable from.
If you have MS Visual Studio .NET, then just open the par2cmdline.sln
file and compile. You should then copy par2cmdline.exe to an appropriate
location that is on your path.
To compile on Linux and other Unix variants use the following commands:
aclocal
automake --add-missing
autoconf
./configure
make
make check
make install
See INSTALL for full details on how to use the "configure" script.
USING PAR2CMDLINE
The command line parameters for par2cmdline are as follows:
par2 -h : show this help
par2 -V : show version
par2 -VV : show version and copyright
par2 c(reate) [options] <par2 file> [files]
par2 v(erify) [options] <par2 file> [files]
par2 r(epair) [options] <par2 file> [files]
Also:
par2create [options] <par2 file> [files]
par2verify [options] <par2 file> [files]
par2repair [options] <par2 file> [files]
Options:
-a<file> : Set the main par2 archive name
required on create, optional for verify and repair
-b<n> : Set the Block-Count
-s<n> : Set the Block-Size (Don't use both -b and -s)
-r<n> : Level of Redundancy (%)
-r<c><n> : Redundancy target size, <c>=g(iga),m(ega),k(ilo) bytes
-c<n> : Recovery block count (don't use both -r and -c)
-f<n> : First Recovery-Block-Number
-u : Uniform recovery file sizes
-l : Limit size of recovery files (Don't use both -u and -l)
-n<n> : Number of recovery files (Don't use both -n and -l)
-m<n> : Memory (in MB) to use
-v [-v] : Be more verbose
-q [-q] : Be more quiet (-qq gives silence)
-p : Purge backup files and par files on successful recovery or
when no recovery is needed
-R : Recurse into subdirectories (only useful on create)
-N : No data skipping (find badly mispositioned data blocks)
-S<n> : Skip leeway (distance +/- from expected block position)
-- : Treat all remaining CommandLine as filenames
If you wish to create par2 files for a single source file, you may leave
out the name of the par2 file from the command line. par2cmdline will then
assume that you wish to base the filenames for the par2 files on the name
of the source file.
You may also leave off the .par2 file extension when verifying and repairing.
CREATING PAR2 FILES
With PAR 2.0 you can create PAR2 recovery files for as few as 1 or as many as
32768 files. If you wanted to create PAR1 recovery files for a single file
you were forced to split the file into multiple parts and RAR was frequently
used for this purpose. You do NOT need to split files with PAR 2.0.
To create PAR 2 recovery files for a single data file (e.g. one called
test.mpg), you can use the following command:
par2 create test.mpg.par2 test.mpg
If test.mpg is an 800 MB file, then this will create a total of 8 PAR2 files
with the following filenames (taking roughly 6 minutes on a PC with a
1500MHz CPU):
test.mpg.par2 - This is an index file for verification only
test.mpg.vol00+01.par2 - Recovery file with 1 recovery block
test.mpg.vol01+02.par2 - Recovery file with 2 recovery blocks
test.mpg.vol03+04.par2 - Recovery file with 4 recovery blocks
test.mpg.vol07+08.par2 - Recovery file with 8 recovery blocks
test.mpg.vol15+16.par2 - Recovery file with 16 recovery blocks
test.mpg.vol31+32.par2 - Recovery file with 32 recovery blocks
test.mpg.vol63+37.par2 - Recovery file with 37 recovery blocks
The test.mpg.par2 file is 39 KB in size and the other files vary in size from
443 KB to 15 MB.
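The doubling pattern in those filenames can be reproduced with a short sketch. This illustrates the default exponential layout only; it is not par2cmdline's actual code, and the function name is made up:

```python
def recovery_file_names(base, total_blocks):
    """Names of par2 recovery files in the default exponential layout.

    Each file holds twice the recovery blocks of the previous one, capped
    so the total is reached exactly. The name encodes
    <first block number>+<block count>.
    """
    names, start, count = [], 0, 1
    while start < total_blocks:
        count = min(count, total_blocks - start)
        names.append("%s.vol%02d+%02d.par2" % (base, start, count))
        start += count
        count *= 2
    return names
```

For 100 recovery blocks this yields the seven volume files listed above, from test.mpg.vol00+01.par2 through test.mpg.vol63+37.par2; for the later 50-block example (-b1000 -r5) the last file comes out as test.mpg.vol31+19.par2, matching the list in that section.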
These par2 files will enable the recovery of up to 100 errors totalling 40 MB
of lost or damaged data from the original test.mpg file when it and the par2
files are posted on UseNet.
When posting on UseNet it is recommended that you use the "-s" option to set
a block size that is equal to the article size that you will use to post the
data file. If you wanted to post the test.mpg file using an article size
of 300 KB, then the command you would type is:
par2 create -s307200 test.mpg.par2 test.mpg
This will create 9 PAR2 files instead of 8, and they will be capable of
correcting up to 134 errors totalling 40 MB. It will take roughly 8 minutes
to create the recovery files this time.
In both of these two examples, the total quantity of recovery data created
was 40 MB (which is 5% of 800 MB). If you wish to create a greater or lesser
quantity of recovery data, you can use the "-r" option.
To create 10% recovery data instead of the default of 5% and also to use a
block size of 300 KB, you would use the following command:
par2 create -s307200 -r10 test.mpg.par2 test.mpg
This would also create 9 PAR2 files, but they would be able to correct up to
269 errors totalling 80 MB. Since twice as much recovery data is created, it
will take about 16 minutes to do so with a 1500MHz CPU.
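The arithmetic behind these figures is simple. The sketch below reproduces the size relationships described above; it does not reproduce par2cmdline's exact internal rounding, which is why the README's reported error counts (134 and 269) differ slightly from a naive percentage calculation:

```python
import math

FILE_SIZE = 800 * 1024 * 1024   # test.mpg, 800 MB
BLOCK_SIZE = 307200             # -s307200, i.e. a 300 KB article size

# number of source blocks the file is divided into
source_blocks = math.ceil(FILE_SIZE / BLOCK_SIZE)        # 2731

# total recovery data for -r5 and -r10, in MB
recovery_mb_5 = FILE_SIZE * 5 // 100 // (1024 * 1024)    # 40 MB
recovery_mb_10 = FILE_SIZE * 10 // 100 // (1024 * 1024)  # 80 MB
```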
The "-u" and "-n" options can be used to control exactly how many recovery
files are created and how the recovery blocks are distributed among them.
They do not affect the total quantity of recovery data created.
The "-f" option is used when you create additional recovery data, e.g. if
you have already created 10% and want another 5%, then you might use the
following command:
par2 create -s307200 -r5 -f300 test.mpg.par2 test.mpg
This specifies the same block size (which is a requirement for additional
recovery files), 5% recovery data, and a first block number of 300.
The "-m" option controls how much memory par2cmdline uses. It defaults to
16 MB unless you override it.
When creating PAR2 recovery files you might want to fill up a "medium" like a
DVD or a Blu-Ray. Therefore we can set the target size of the recovery files by
issuing the following command:
par2 create -rm200 recovery.par2 *
It makes no sense to set an insanely high recovery size. The command will
ensure that the total size of the resulting par2 files approaches the
requested size. It is an estimate, so don't go too crazy.
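The redundancy level that a "-rm" target implies is just the target size divided by the total source size. A back-of-envelope sketch, not par2cmdline's estimator (the 4 GB source size is a made-up example):

```python
target_bytes = 200 * 1024 * 1024        # -rm200: about 200 MB of recovery data
source_bytes = 4 * 1024 * 1024 * 1024   # hypothetical: ~4 GB of files to protect

# equivalent "-r" percentage, roughly 4.9% here
redundancy_pct = 100.0 * target_bytes / source_bytes
```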
CREATING PAR2 FILES FOR MULTIPLE DATA FILES
When creating PAR2 recovery files from multiple data files, you must specify
the base filename to use for the par2 files and the names of all of the data
files.
If test.mpg had been split into multiple RAR files, then you could use:
par2 create test.mpg.rar.par2 test.mpg.part*.rar
The filename "test.mpg.rar.par2" specifies what you want the par2 files to
be called, and "test.mpg.part*.rar" should select all of the RAR files.
VERIFYING AND REPAIRING
When using par2 recovery files to verify or repair the data files from
which they were created, you only need to specify the filename of one
of the par2 files to par2cmdline.
e.g.:
par2 verify test.mpg.par2
This tells par2cmdline to use the information in test.mpg.par2 to verify the
data files.
par2cmdline will automatically search for the other par2 files that were
created and use the information they contain to determine the filenames
of the original data files and then to verify them.
If all of the data files are OK, then par2cmdline will report that repair
will not be required.
If any of the data files are missing or damaged, par2cmdline will report
the details of what it has found. If the recovery files contain enough
recovery blocks to repair the damage, you will be told that repair is
possible. Otherwise you will be told exactly how many recovery blocks
will be required in order to repair.
To carry out a repair use the following command:
par2 repair test.mpg.par2
This tells par2cmdline to verify and if possible repair any damaged or
missing files. If a repair is carried out, then each file which is
repaired will be re-verified to confirm that the repair was successful.
MISNAMED AND INCOMPLETE DATA FILES
If any of the recovery files or data files have the wrong filename, then
par2cmdline will not automatically find and scan them.
To have par2cmdline scan such files, you must include them on the command
line when attempting to verify or repair.
e.g.:
par2 r test.mpg.par2 other.mpg
This tells par2cmdline to scan the file called other.mpg to see if it
contains any data belonging to the original data files.
If one of the extra files specified in this way is an exact match
for a data file, then the repair process will rename the file so that
it has the correct filename.
Because par2cmdline is designed to be able to find good data within a
damaged file, it can do the same with incomplete files downloaded from
UseNet. If some of the articles for a file are missing, you should still
download the file and save it to disk for par2cmdline to scan. If you
do this then you may find that you can carry out a repair in a situation
where you would not otherwise have sufficient recovery data.
You can have par2cmdline scan all files that are in the current directory
using a command such as:
par2 r test.mpg.par2 *
WHAT TO DO WHEN YOU ARE TOLD YOU NEED MORE RECOVERY BLOCKS
If par2cmdline determines that any of the data files are damaged or
missing and finds that there is insufficient recovery data to effect
a repair, you will be told that you need a certain number of recovery
blocks. You can obtain these by downloading additional recovery files.
In order to make things easy, par2 files have filenames that tell you
exactly how many recovery blocks each one contains.
Assuming that the following command was used to create recovery data:
par2 c -b1000 -r5 test.mpg
Then the recovery files that are created would be called:
test.mpg.par2
test.mpg.vol00+01.par2
test.mpg.vol01+02.par2
test.mpg.vol03+04.par2
test.mpg.vol07+08.par2
test.mpg.vol15+16.par2
test.mpg.vol31+19.par2
The first file in this list does not contain any recovery data, it only
contains information to verify the data files.
Each of the other files contains a different number of recovery blocks.
The number after the '+' sign is the number of recovery blocks and the
number preceding the '+' sign is the block number of the first recovery
block in that file.
If par2cmdline told you that you needed 10 recovery blocks, then you would
need "test.mpg.vol01+02.par2" and "test.mpg.vol07+08.par2". You might of course
choose to fetch "test.mpg.vol15+16.par2" instead (in which case you would have
an extra 6 recovery blocks which would not be used for the repair).
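Choosing which files to fetch is a small greedy exercise: take the largest volume that does not overshoot the remaining need, and repeat. An illustrative sketch only; picking files by hand works just as well:

```python
def pick_recovery_files(block_counts, needed):
    """Greedily choose recovery-block counts, largest first, without overshoot.

    Returns the chosen counts and any still-unmet remainder (a remainder > 0
    means you must fetch one additional file and accept some excess blocks).
    """
    picked, remaining = [], needed
    for count in sorted(block_counts, reverse=True):
        if count <= remaining:
            picked.append(count)
            remaining -= count
    return picked, remaining
```

For the example above, needing 10 blocks from the available set {1, 2, 4, 8, 16, 19} selects the 8-block and 2-block files, i.e. vol07+08 and vol01+02, with nothing wasted.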
REED SOLOMON CODING
PAR2 uses Reed Solomon Coding to perform its calculations. For details of this
coding technique try the following link:
``A Tutorial on Reed-Solomon Coding for Fault-Tolerance in RAID-like Systems''
<http://web.eecs.utk.edu/~plank/plank/papers/CS-96-332.html>
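Reed-Solomon coding itself is beyond the scope of this README, but its simplest special case, a single recovery block, is plain XOR parity: any one lost source block can be rebuilt from the surviving blocks plus the parity block. A minimal illustration (PAR2 actually works over GF(2^16), which is what lets it recover many blocks, not just one):

```python
# three equal-sized "source blocks"
blocks = [b"hello world!", b"lorem ipsum.", b"post article"]

# one recovery block: byte-wise XOR of all source blocks
parity = bytes(a ^ b ^ c for a, b, c in zip(*blocks))

# suppose blocks[1] is lost; XORing the survivors with the parity restores it
recovered = bytes(a ^ c ^ p for a, c, p in zip(blocks[0], blocks[2], parity))
assert recovered == blocks[1]
```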


Binary file not shown.


@@ -1,340 +0,0 @@
GNU GENERAL PUBLIC LICENSE
Version 2, June 1991
Copyright (C) 1989, 1991 Free Software Foundation, Inc.
59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.
Preamble
The licenses for most software are designed to take away your
freedom to share and change it. By contrast, the GNU General Public
License is intended to guarantee your freedom to share and change free
software--to make sure the software is free for all its users. This
General Public License applies to most of the Free Software
Foundation's software and to any other program whose authors commit to
using it. (Some other Free Software Foundation software is covered by
the GNU Library General Public License instead.) You can apply it to
your programs, too.
When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
this service if you wish), that you receive source code or can get it
if you want it, that you can change the software or use pieces of it
in new free programs; and that you know you can do these things.
To protect your rights, we need to make restrictions that forbid
anyone to deny you these rights or to ask you to surrender the rights.
These restrictions translate to certain responsibilities for you if you
distribute copies of the software, or if you modify it.
For example, if you distribute copies of such a program, whether
gratis or for a fee, you must give the recipients all the rights that
you have. You must make sure that they, too, receive or can get the
source code. And you must show them these terms so they know their
rights.
We protect your rights with two steps: (1) copyright the software, and
(2) offer you this license which gives you legal permission to copy,
distribute and/or modify the software.
Also, for each author's protection and ours, we want to make certain
that everyone understands that there is no warranty for this free
software. If the software is modified by someone else and passed on, we
want its recipients to know that what they have is not the original, so
that any problems introduced by others will not reflect on the original
authors' reputations.
Finally, any free program is threatened constantly by software
patents. We wish to avoid the danger that redistributors of a free
program will individually obtain patent licenses, in effect making the
program proprietary. To prevent this, we have made it clear that any
patent must be licensed for everyone's free use or not licensed at all.
The precise terms and conditions for copying, distribution and
modification follow.
GNU GENERAL PUBLIC LICENSE
TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION
0. This License applies to any program or other work which contains
a notice placed by the copyright holder saying it may be distributed
under the terms of this General Public License. The "Program", below,
refers to any such program or work, and a "work based on the Program"
means either the Program or any derivative work under copyright law:
that is to say, a work containing the Program or a portion of it,
either verbatim or with modifications and/or translated into another
language. (Hereinafter, translation is included without limitation in
the term "modification".) Each licensee is addressed as "you".
Activities other than copying, distribution and modification are not
covered by this License; they are outside its scope. The act of
running the Program is not restricted, and the output from the Program
is covered only if its contents constitute a work based on the
Program (independent of having been made by running the Program).
Whether that is true depends on what the Program does.
1. You may copy and distribute verbatim copies of the Program's
source code as you receive it, in any medium, provided that you
conspicuously and appropriately publish on each copy an appropriate
copyright notice and disclaimer of warranty; keep intact all the
notices that refer to this License and to the absence of any warranty;
and give any other recipients of the Program a copy of this License
along with the Program.
You may charge a fee for the physical act of transferring a copy, and
you may at your option offer warranty protection in exchange for a fee.
2. You may modify your copy or copies of the Program or any portion
of it, thus forming a work based on the Program, and copy and
distribute such modifications or work under the terms of Section 1
above, provided that you also meet all of these conditions:
a) You must cause the modified files to carry prominent notices
stating that you changed the files and the date of any change.
b) You must cause any work that you distribute or publish, that in
whole or in part contains or is derived from the Program or any
part thereof, to be licensed as a whole at no charge to all third
parties under the terms of this License.
c) If the modified program normally reads commands interactively
when run, you must cause it, when started running for such
interactive use in the most ordinary way, to print or display an
announcement including an appropriate copyright notice and a
notice that there is no warranty (or else, saying that you provide
a warranty) and that users may redistribute the program under
these conditions, and telling the user how to view a copy of this
License. (Exception: if the Program itself is interactive but
does not normally print such an announcement, your work based on
the Program is not required to print an announcement.)
These requirements apply to the modified work as a whole. If
identifiable sections of that work are not derived from the Program,
and can be reasonably considered independent and separate works in
themselves, then this License, and its terms, do not apply to those
sections when you distribute them as separate works. But when you
distribute the same sections as part of a whole which is a work based
on the Program, the distribution of the whole must be on the terms of
this License, whose permissions for other licensees extend to the
entire whole, and thus to each and every part regardless of who wrote it.
Thus, it is not the intent of this section to claim rights or contest
your rights to work written entirely by you; rather, the intent is to
exercise the right to control the distribution of derivative or
collective works based on the Program.
In addition, mere aggregation of another work not based on the Program
with the Program (or with a work based on the Program) on a volume of
a storage or distribution medium does not bring the other work under
the scope of this License.
3. You may copy and distribute the Program (or a work based on it,
under Section 2) in object code or executable form under the terms of
Sections 1 and 2 above provided that you also do one of the following:
a) Accompany it with the complete corresponding machine-readable
source code, which must be distributed under the terms of Sections
1 and 2 above on a medium customarily used for software interchange; or,
b) Accompany it with a written offer, valid for at least three
years, to give any third party, for a charge no more than your
cost of physically performing source distribution, a complete
machine-readable copy of the corresponding source code, to be
distributed under the terms of Sections 1 and 2 above on a medium
customarily used for software interchange; or,
c) Accompany it with the information you received as to the offer
to distribute corresponding source code. (This alternative is
allowed only for noncommercial distribution and only if you
received the program in object code or executable form with such
an offer, in accord with Subsection b above.)
The source code for a work means the preferred form of the work for
making modifications to it. For an executable work, complete source
code means all the source code for all modules it contains, plus any
associated interface definition files, plus the scripts used to
control compilation and installation of the executable. However, as a
special exception, the source code distributed need not include
anything that is normally distributed (in either source or binary
form) with the major components (compiler, kernel, and so on) of the
operating system on which the executable runs, unless that component
itself accompanies the executable.
If distribution of executable or object code is made by offering
access to copy from a designated place, then offering equivalent
access to copy the source code from the same place counts as
distribution of the source code, even though third parties are not
compelled to copy the source along with the object code.
4. You may not copy, modify, sublicense, or distribute the Program
except as expressly provided under this License. Any attempt
otherwise to copy, modify, sublicense or distribute the Program is
void, and will automatically terminate your rights under this License.
However, parties who have received copies, or rights, from you under
this License will not have their licenses terminated so long as such
parties remain in full compliance.
5. You are not required to accept this License, since you have not
signed it. However, nothing else grants you permission to modify or
distribute the Program or its derivative works. These actions are
prohibited by law if you do not accept this License. Therefore, by
modifying or distributing the Program (or any work based on the
Program), you indicate your acceptance of this License to do so, and
all its terms and conditions for copying, distributing or modifying
the Program or works based on it.
6. Each time you redistribute the Program (or any work based on the
Program), the recipient automatically receives a license from the
original licensor to copy, distribute or modify the Program subject to
these terms and conditions. You may not impose any further
restrictions on the recipients' exercise of the rights granted herein.
You are not responsible for enforcing compliance by third parties to
this License.
7. If, as a consequence of a court judgment or allegation of patent
infringement or for any other reason (not limited to patent issues),
conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot
distribute so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you
may not distribute the Program at all. For example, if a patent
license would not permit royalty-free redistribution of the Program by
all those who receive copies directly or indirectly through you, then
the only way you could satisfy both it and this License would be to
refrain entirely from distribution of the Program.
If any portion of this section is held invalid or unenforceable under
any particular circumstance, the balance of the section is intended to
apply and the section as a whole is intended to apply in other
circumstances.
It is not the purpose of this section to induce you to infringe any
patents or other property right claims or to contest validity of any
such claims; this section has the sole purpose of protecting the
integrity of the free software distribution system, which is
implemented by public license practices. Many people have made
generous contributions to the wide range of software distributed
through that system in reliance on consistent application of that
system; it is up to the author/donor to decide if he or she is willing
to distribute software through any other system and a licensee cannot
impose that choice.
This section is intended to make thoroughly clear what is believed to
be a consequence of the rest of this License.
8. If the distribution and/or use of the Program is restricted in
certain countries either by patents or by copyrighted interfaces, the
original copyright holder who places the Program under this License
may add an explicit geographical distribution limitation excluding
those countries, so that distribution is permitted only in or among
countries not thus excluded. In such case, this License incorporates
the limitation as if written in the body of this License.
9. The Free Software Foundation may publish revised and/or new versions
of the General Public License from time to time. Such new versions will
be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.
Each version is given a distinguishing version number. If the Program
specifies a version number of this License which applies to it and "any
later version", you have the option of following the terms and conditions
either of that version or of any later version published by the Free
Software Foundation. If the Program does not specify a version number of
this License, you may choose any version ever published by the Free Software
Foundation.
10. If you wish to incorporate parts of the Program into other free
programs whose distribution conditions are different, write to the author
to ask for permission. For software which is copyrighted by the Free
Software Foundation, write to the Free Software Foundation; we sometimes
make exceptions for this. Our decision will be guided by the two goals
of preserving the free status of all derivatives of our free software and
of promoting the sharing and reuse of software generally.
NO WARRANTY
11. BECAUSE THE PROGRAM IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY
FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN
OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES
PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED
OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS
TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE
PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING,
REPAIR OR CORRECTION.
12. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR
REDISTRIBUTE THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES,
INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING
OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED
TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY
YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER
PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE
POSSIBILITY OF SUCH DAMAGES.
END OF TERMS AND CONDITIONS
How to Apply These Terms to Your New Programs
If you develop a new program, and you want it to be of the greatest
possible use to the public, the best way to achieve this is to make it
free software which everyone can redistribute and change under these terms.
To do so, attach the following notices to the program. It is safest
to attach them to the start of each source file to most effectively
convey the exclusion of warranty; and each file should have at least
the "copyright" line and a pointer to where the full notice is found.
<one line to give the program's name and a brief idea of what it does.>
Copyright (C) <year> <name of author>
This program is free software; you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation; either version 2 of the License, or
(at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with this program; if not, write to the Free Software
Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
Also add information on how to contact you by electronic and paper mail.
If the program is interactive, make it output a short notice like this
when it starts in an interactive mode:
Gnomovision version 69, Copyright (C) year name of author
Gnomovision comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
This is free software, and you are welcome to redistribute it
under certain conditions; type `show c' for details.
The hypothetical commands `show w' and `show c' should show the appropriate
parts of the General Public License. Of course, the commands you use may
be called something other than `show w' and `show c'; they could even be
mouse-clicks or menu items--whatever suits your program.
You should also get your employer (if you work as a programmer) or your
school, if any, to sign a "copyright disclaimer" for the program, if
necessary. Here is a sample; alter the names:
Yoyodyne, Inc., hereby disclaims all copyright interest in the program
`Gnomovision' (which makes passes at compilers) written by James Hacker.
<signature of Ty Coon>, 1 April 1989
Ty Coon, President of Vice
This General Public License does not permit incorporating your program into
proprietary programs. If your program is a subroutine library, you may
consider it more useful to permit linking proprietary applications with the
library. If this is what you want to do, use the GNU Library General
Public License instead of this License.


@@ -1,885 +0,0 @@
=== Table of contents ===
--- Introduction ---
--- Installing the pre-built Windows version ---
--- Installing the pre-built Mac OS X version ---
--- Installing the pre-built Linux version ---
--- Building and installing on UNIX type systems ---
--- Building and installing on Mac OS X systems ---
--- Building and installing on Windows operating systems ---
--- Building and installing on FreeBSD ---
--- Technical Details ---
--- Version History ---
=== Table of contents ===
--- Introduction ---
This is a concurrent (multithreaded) version of par2cmdline 0.4, a utility to
create and repair data files using Reed Solomon coding. par2 parity archives
are commonly used on Usenet postings to allow corrupted postings to be
repaired instead of needing the original poster to repost the corrupted
file(s).
For more information about par2, go to this web site:
http://parchive.sourceforge.net/
The original version of par2cmdline 0.4 was downloaded from:
http://sourceforge.net/projects/parchive
This version has been modified to utilise the Intel Threading Building Blocks
library, which enables it to process files concurrently instead of the
original version's serial processing. Computers with more than one CPU or core,
such as those using Intel Core Duo, Intel Core 2 Duo, or AMD Athlon X2 CPUs,
can now create or repair par2 archives much more quickly than the original version.
For example, dual core machines can achieve near-double performance when
creating or repairing.
The Intel Threading Building Blocks library is obtained from:
http://osstbb.intel.com/
The licensing of this source code has not been modified: it is still published
under the GPLv2 (or later), and the COPYING file is included in this
distribution as per the GPL.
To download the source code or some operating system builds of the
concurrent version of par2cmdline 0.4, go to:
http://www.chuchusoft.com/par2_tbb
--- Installing the pre-built Windows version ---
The Windows version is distributed as an executable (par2.exe) which has
built into it (i.e., statically linked) the Intel Threading Building Blocks
4.3 Update 1 library, built from the tbb43_20141023oss_src.tgz distribution.
The Windows version is portable (can be run from a USB thumb drive) and does
not require a specific version of the C runtime library because the par2.exe
executable is built by statically linking with the C runtime library.
To install, copy the par2.exe file and then invoke it from the command line.
To uninstall, delete the par2.exe file along with any files from the
distribution folder.
--- Installing the pre-built Mac OS X version ---
The Mac version is a universal build of the concurrent version of par2cmdline 0.4
for Mac OS X 10.5. In other words, the par2 executable file contains both a 32-bit
x86 and a 64-bit x86_64 build of the par2 sources. It is also portable and can be
run from a USB thumb drive (no need to copy to the Mac's internal storage device).
It is distributed as an executable (par2) along with the required universal build
of the Intel Threading Building Blocks 4.3 Update 1 library (libtbb.dylib).
To install, place the par2 and libtbb.dylib files in a folder and
invoke them from the command line.
To uninstall, delete the par2 and libtbb.dylib files along with any
files from the distribution folder.
--- Installing the pre-built Linux version ---
The Linux versions are 32-bit i386 and 64-bit x86_64 builds of the
concurrent version of par2cmdline 0.4 for GNU/Linux kernel version 2.6
with GCC 4. Each is distributed as an executable (par2) along with the
required Intel Threading Building Blocks 4.3 Update 1 (libtbb.so and
libtbb.so.2). There are separate distributions for the 32-bit and
64-bit versions. They are also portable and can be run from a USB thumb
drive (no need to copy to the computer's internal storage device).
To install, place the par2, libtbb.so and libtbb.so.2 files in a
folder and invoke them from the command line.
To uninstall, delete the par2, libtbb.so and libtbb.so.2 files along
with any files from the distribution folder.
--- Building and installing on UNIX type systems ---
For UNIX or similar systems, the included configure script should be used to
generate a makefile which is then built with a Make utility. Before using
them however, you may need to modify the configure scripts as detailed below.
Because this version depends on the Intel Threading Building Blocks library,
you will need to tell the build system where the headers and libraries are in
order to compile and link the program. There are two ways to do this: use the
tbbvars.sh script included in TBB to add the appropriate environment variables,
or manually modify the Makefile to use the appropriate paths. The tbbvars.sh
file is in the tbb<version>oss_src/build directory. To manually modify the
Makefile:
In `Makefile.am', for Darwin/Mac OS X, change the AM_CXXFLAGS line to:
AM_CXXFLAGS = -Wall -I../tbb43_20141023oss/include -gfull -O3 -fvisibility=hidden -fvisibility-inlines-hidden
or for other POSIX systems, change the AM_CXXFLAGS line to:
AM_CXXFLAGS = -Wall -I../tbb43_20141023oss/include
and modify the path to wherever your extracted Intel TBB files are. Note that it
should point at the `include' directory inside the main tbb directory.
For linking, the file `Makefile.am' has this line:
LDADD = -lstdc++ -ltbb -L.
thus the tbb library is already added to the list of libraries to link against.
You will need to have libtbb.a (or libtbb.dylib or libtbb.so etc.) in your
library path (usually /usr/lib).
Alternatively, if the TBB library is not in a standard library directory (or
on the linker's list of library paths) then add a library path so the linker
can link to the TBB:
LDADD = -lstdc++ -ltbb -L<directory>
For example:
LDADD = -lstdc++ -ltbb -L.
The Mac OS X distribution of this project is built using a relative-path
for the dynamic library. Please see the next section for more information.
The GNU/Linux distribution of this project is built using a relative-path
for the dynamic library (by passing the "-R $ORIGIN" option to the linker).
--- Building and installing on Mac OS X systems ---
The Mac version is a universal build of the concurrent version of par2cmdline 0.4
for Mac OS X 10.5. In other words, the par2 executable file contains both a 32-bit
x86 and a 64-bit x86_64 build of the par2 sources.
It is distributed as an executable (par2) along with the required Intel
Threading Building Blocks 4.3 Update 1 library (libtbb.dylib). The libtbb.dylib
file is also universal (32-bit and 64-bit versions for x86/x86_64 are inside it).
The distributed version is built on a 10.6.8 system using the compiler toolchain
from Xcode 3.2.6: GCC 4.2. The target OS is 10.5 using the 10.5 SDK.
The libtbb.dylib file in the distribution is built from the TBB 4.3 Update 1
tbb43_20141023oss_src.tgz sources, and was built for the x86 and x86_64
architectures.
The default compiler is clang 1.7, which does not compile the TBB library
(because it has bugs when compiling C++ source code), so the compiler needs to be
changed to GCC 4.2.
Normally, the libtbb.dylib file is built so that for a client program to use
it, it would have to be placed in /usr/lib, and would therefore require
administrator privileges to install it onto a Mac OS X system. The version
included in this distribution does not need to be installed in /usr/lib, and
is therefore usable "out of the box" and portable (eg, can be run from a USB
thumb drive).
So to build it the same way as in the distribution, the macos.clang.inc file
needs to be modified with these lines:
WARNING_SUPPRESS = -Wno-non-virtual-dtor ### -Wno-dangling-else (no-dangling-else is clang-specific)
LIB_LINK_FLAGS = -dynamiclib -Wl,-install_name,@executable_path/$@ ### enables portable .dylib
ifeq (intel64,$(arch))
CPLUS = g++-4.2 ### because clang 1.7 cannot compile the TBB
CPLUS_FLAGS += -m64 -mmacosx-version-min=10.5
LINK_FLAGS += -m64 -mmacosx-version-min=10.5
LIB_LINK_FLAGS += -m64 -mmacosx-version-min=10.5
endif
ifeq (ia32,$(arch))
CPLUS = g++-4.2 ### because clang 1.7 cannot compile the TBB
CPLUS_FLAGS += -m32 -mmacosx-version-min=10.5
LINK_FLAGS += -m32 -mmacosx-version-min=10.5
LIB_LINK_FLAGS += -m32 -mmacosx-version-min=10.5
endif
Then build the x86 and x86_64 variants using:
cd <TBB-src>
make tbb arch=ia32 SDKROOT=/Developer/SDKs/MacOSX10.5.sdk
make tbb arch=intel64 SDKROOT=/Developer/SDKs/MacOSX10.5.sdk
Then create the final dylib using (this example is built on a 10.6.8 system):
cp ./build/macos_ia32_clang_cc4.2.1_os10.6.8_release/libtbb.dylib libtbb-x86.dylib
cp ./build/macos_intel64_clang_cc4.2.1_os10.6.8_release/libtbb.dylib libtbb-x86_64.dylib
lipo -create -o libtbb.dylib libtbb-x86.dylib libtbb-x86_64.dylib
strip -x libtbb.dylib
To build the executables, configure needs to be invoked in a particular manner for both x86 and x64 builds:
cd <par2_tbb_root>/build
../configure --build=i686-apple-darwin10.2.0 --host=i686-apple-darwin10.2.0 CXX=g++-4.2 && sed -e 's/CXXFLAGS = -g -O2/CXXFLAGS = #-g -O2/' Makefile > Makefile.tmp && mv Makefile.tmp Makefile && make && strip par2 && mv par2 par2-x86 && make clean
../configure --build=i686-apple-darwin10.2.0 --host=x86_64-apple-darwin10.2.0 CXX=g++-4.2 && sed -e 's/CXXFLAGS = -g -O2/CXXFLAGS = #-g -O2/' Makefile > Makefile.tmp && mv Makefile.tmp Makefile && make && strip par2 && mv par2 par2-x86_64 && make clean
lipo -create -o par2 par2-x86 par2-x86_64
Note: the distributed copies of the par2 and libtbb.dylib files are symbol stripped (using the 'strip'
command line tool) to reduce their size.
--- Building and installing on Windows operating systems ---
This modified version has been built and tested on Windows 7 using Visual Studio 2013.
It statically links with both the TBB and the C runtime library and the included
Makefile, Project and Solution files are set up to build in this manner. To build the
program, you need to build the TBB as a static library and then build par2.
[1] install Windows SDK v7.1 (only the Windows headers and libraries are required)
and Visual Studio 2013 for Windows Desktop or Visual Studio 2013 Community Edition
(only the C++ compilers, headers and libraries are required).
[2] extract the TBB source tarball into a directory, which will be referred to as <tbb>
in the instructions below
[3] in <tbb>/build, modify windows.inc:
# static library version of TBB does not need .def file:
#TBB.DEF = $(TBB.LST:.lst=.def)
# static library version of TBB should use .lib suffix:
#TBB.DLL = tbb$(CPF_SUFFIX)$(DEBUG_SUFFIX).$(DLL)
TBB.DLL = tbb$(CPF_SUFFIX)$(DEBUG_SUFFIX).$(LIBEXT)
# static library version of TBB does not need a version resource:
#TBB.RES = tbb_resource.res
# static library version of TBB uses lib.exe to build the library, not "cl.exe /DLL":
LIB_LINK_CMD = lib.exe
[4] in <tbb>/build, modify windows.cl.inc:
# static library version of TBB only needs to pass /nologo to lib.exe:
#LIB_LINK_FLAGS=/link /nologo /DLL /MAP /DEBUG /fixed:no /INCREMENTAL:NO /DYNAMICBASE /NXCOMPAT
LIB_LINK_FLAGS=/nologo
# static library version of TBB cannot pass /SAFESEH to lib.exe:
# LIB_LINK_FLAGS += /SAFESEH
# static library version of TBB asks lib.exe to output to tbb.lib or tbb_debug.lib:
#OUTPUT_KEY = /Fe
OUTPUT_KEY = /out:
[5] open Visual Studio 2013 -> Visual Studio Tools -> open a VS2013 x64 Cross Tools Command Prompt window
[6] modify these environment variables:
set INCLUDE=C:\Program Files (x86)\Microsoft SDKs\Windows\v7.1A\Include;C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\INCLUDE;
set LIB=C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\LIB\amd64;C:\Program Files (x86)\Microsoft SDKs\Windows\v7.1A\Lib\x64
[7] build an x64 (64-bit) version of the TBB using GNU make. If you do not have GNU make,
first download the source tarball for it and build it using its instructions.
Note the use of the vc_mt runtime, which causes the TBB library to be statically
linked with the C runtime library:
cd <tbb>
gmake.exe tbb runtime=vc_mt arch=intel64
[8] open Visual Studio 2013 -> Visual Studio Tools -> open a VS2013 x86 Native Tools Command Prompt window
[9] modify these environment variables:
set INCLUDE=C:\Program Files (x86)\Microsoft SDKs\Windows\v7.1A\Include;C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\INCLUDE;
set LIB=C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\LIB;C:\Program Files (x86)\Microsoft SDKs\Windows\v7.1A\Lib
[10] build an x86 (32-bit) version of the TBB using GNU make:
cd <tbb>
gmake.exe tbb runtime=vc_mt arch=ia32
[11] from here, you can either build par2 using a Visual C++ project or from the command line using
the Windows SDK make tool.
To build using the Visual C++ project, open the par2cmdline.sln solution file in Visual Studio
2013 for Windows Desktop (or the Community Edition), select the configuration you want to build,
and then build the program.
To build using the Windows SDK make tool, go back to the VS2013 x64 Cross Tools Command Prompt
window you opened in step [5] and do this to create the par2_win64.exe executable:
cd <par2>
nmake nodebug=1 arch=x64
del *.obj
Then go back to the VS2013 x86 Native Tools Command Prompt window you opened in step [8]
and do this to create the par2_win32.exe executable:
cd <par2>
nmake nodebug=1 arch=x86
del *.obj
Note: the makefile assumes that the <par2> and <tbb> source folders are both in the same folder.
If this is not the case, change this line in the Makefile so that the linker can find the TBB
library you built above:
MY_TBB_DIR=../tbb43_20141023oss
--- Building and installing on FreeBSD ---
The instructions below are not needed if you use the FreeBSD ports system to
download, unpack, compile, link and install the program. Please see the
documentation in the ports system for instructions on its use. It is recommended
that the ports system be used to build the program since the source code can
build without modification. Please consider the following to be deprecated or for
educational use only.
Instructions for building without using the FreeBSD ports system:
[1] build and install TBB
- extract TBB from the source archive.
- on a command line, execute:
cp -r <TBB-src>/include/tbb /usr/local/include
cd <TBB-src> && /usr/local/bin/gmake
# change the next line to match your machine's configuration:
cp <TBB-src>/build/FreeBSD_em64t_gcc_cc4.1.0_kernel7.0_release/libtbb.so /usr/local/lib
[2] build and install par2cmdline-0.4-tbb
- extract and build par2cmdline-0.4-tbb using tar, ./configure, and make
- copy built binary to where you want to install it (eg, /usr/local/bin)
[3] cleanup
- remove <TBB-src> and par2cmdline-0.4-tbb source directories
--- Technical Details ---
All source code modifications have been isolated to blocks that have this form:
#if WANT_CONCURRENT
<code added for concurrency>
#else
<original code>
#endif
to make it easier to see what was modified and how it was done.
The technique used to modify the original code was:
[1] add timing code to instrument/document the places where concurrency would be of
benefit. The CTimeInterval class was used to time sections of the code.
[2] decide which functions to make concurrent, based on the timing information
obtained in step [1].
[3] for each function to make concurrent, study it and its sub-functions for
concurrent access problems (shared data points)
[4] read the Intel TBB tutorials and reference manual to learn how to use the
library to convert serial code to concurrent code
It was then decided to apply concurrency to:
- loading of recovery packets (par2 files), which necessitated changes to some member
variables in par2repairer.h:
- sourcefilemap [LoadDescriptionPacket, LoadVerificationPacket]
- recoverypacketmap [LoadRecoveryPacket]
- mainpacket [LoadMainPacket]
- creatorpacket [LoadCreatorPacket]
They were changed to use concurrent-safe containers/wrappers. To handle concurrent
access to pointer-based member variables, the pointers are wrapped in atomic<T>
wrappers. tbb::atomic<T> does not have operator->, which is needed to dereference
the wrapped pointers, so a sub-class of tbb::atomic<T> was created, named
atomic_ptr<T>. For maps and vectors, tbb's concurrent_hash_map and concurrent_vector
were used.
Because DiskFileMap needed to be accessed concurrently, a concurrent version of it
was created (class ConcurrentDiskFileMap).
- source file verification
- repairing data blocks
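The atomic_ptr<T> idea can be sketched in standard C++. This is a hypothetical stand-in for the tbb::atomic<T> sub-class described above, not the project's actual code; the class members and the MainPacket type are illustrative:

```cpp
#include <atomic>
#include <string>

// Minimal sketch of an atomic pointer wrapper adding operator->, which
// plain atomic pointer types lack. Loads and stores are sequentially
// consistent, mirroring tbb::atomic<T>'s default behaviour.
template <typename T>
class atomic_ptr {
public:
    explicit atomic_ptr(T* p = nullptr) : ptr_(p) {}
    atomic_ptr& operator=(T* p) { ptr_.store(p); return *this; }
    T* operator->() const { return ptr_.load(); }  // dereference the wrapped pointer
    T& operator*()  const { return *ptr_.load(); }
    operator T*()   const { return ptr_.load(); }
private:
    std::atomic<T*> ptr_;
};

// Illustrative packet type standing in for par2repairer's mainpacket.
struct MainPacket { std::string creator = "par2"; };
```

With this wrapper in place, code such as mainpacket->creator keeps working unchanged after the member variable's type switches from a raw pointer to an atomic one.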
In the original version, progress information was written to cout (stdout) in a serial
manner, but the concurrent version would produce garbled overlapping output unless
output was made concurrent-safe. This was achieved in two ways: for simple infrequent
output routines, a simple mutex was used to gate access to cout to only one thread at
a time. For frequent use of cout, such as during the repair process, an atomic integer
variable was used to gate access, but *without* blocking a thread that would have
otherwise been blocked if a mutex had been used instead. The code used is:
if (0 == cout_in_use.compare_and_swap(outputendindex, 0)) { // <= this version doesn't block - only need 1 thread to write to cout
cout << "Processing: " << newfraction/10 << '.' << newfraction%10 << "%\r" << flush;
cout_in_use = 0;
}
Initially cout_in_use is set to zero so that the first thread to put its value of
outputendindex into cout_in_use will get a zero back from cout_in_use.compare_and_swap()
and therefore enter the 'true block' of the 'if' statement. Other threads that then try
to put their value of outputendindex into cout_in_use while the first thread is still
using cout will fail to do so and so they will skip the 'true block' but they won't block.
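The same non-blocking gate can be sketched with std::atomic in place of tbb::atomic's compare_and_swap. The names are illustrative, and as in the original code, the caller's outputendindex value must be nonzero for the gate to work:

```cpp
#include <atomic>

std::atomic<unsigned> cout_in_use{0};

// Returns true if the caller won the gate (and must call leave() afterwards);
// losers simply skip their output instead of blocking, as described above.
bool try_enter(unsigned outputendindex) {
    unsigned expected = 0;
    // The exchange succeeds only when cout_in_use is still 0.
    return cout_in_use.compare_exchange_strong(expected, outputendindex);
}

void leave() { cout_in_use.store(0); }
```

A thread that loses the race loses nothing important: a fresher progress line will be printed by whichever thread wins the gate next.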
For par2 creation, similar modifications were made to the source code that also allowed
concurrent processing to occur.
To convert from serial to concurrent operation, for() loops were changed to using Intel
TBB parallel_for() calls, with a functor object (callback) supplied to provide the body
of the parallel for loop. To access member variables in the body of the parallel loop,
new member functions were added so that the functor's operator() could dispatch into the
original object to do the for loop body's processing.
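The functor-dispatch pattern reads roughly like this. It is a simplified sketch, not par2cmdline's code: the class and method names are invented, and plain std::threads stand in for TBB's parallel_for splitting the range:

```cpp
#include <cstddef>
#include <numeric>
#include <thread>
#include <vector>

// The for-loop body moves into a functor whose operator() dispatches back
// into the original object via a member function added for that purpose.
class Repairer {
public:
    explicit Repairer(std::size_t n) : partial_(n, 0) {}
    // New member function so the functor can reach the loop body.
    void ProcessBlock(std::size_t i) { partial_[i] = static_cast<long>(i) * 2; }
    long Total() const { return std::accumulate(partial_.begin(), partial_.end(), 0L); }
private:
    std::vector<long> partial_;
};

// Functor supplied as the body of the parallel loop; TBB would invoke it
// over sub-ranges, here two threads split the range in half.
struct RepairBody {
    Repairer* self;
    void operator()(std::size_t begin, std::size_t end) const {
        for (std::size_t i = begin; i != end; ++i) self->ProcessBlock(i);
    }
};

long run(std::size_t n) {
    Repairer r(n);
    RepairBody body{&r};
    std::thread t1(body, std::size_t(0), n / 2), t2(body, n / 2, n);
    t1.join(); t2.join();
    return r.Total();
}
```

Each index is written by exactly one thread, so no further synchronization is needed inside the loop body.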
It should be noted that there are two notable parts of the program that could not be
made concurrent: (1) file verification involves computing MD5 hashes for the entire file
but computing the hash is an inherently serial computation, and (2) computing the Reed-
Solomon matrix for use in creation or repair involves matrix multiplication over a Galois
field, which is also an inherently serial computation and so it too could not be made into
a concurrent operation.
Nevertheless, the majority of the program's execution time is spent either repairing the
lost data, or in creating the redundancy information for later repair, and both of these
operations were able to be made concurrent with a nearly two-fold speedup on the dual core
machines that the concurrent version was tested on.
Note that it is important that the computer has sufficient memory (1) to allow the caching
of data and (2) to avoid virtual memory swapping, otherwise the creation or repair process
will become I/O bound instead of CPU bound. Computers with 1 to 2GB of RAM should have
enough memory to not be I/O bound when creating or repairing parity/data files.
--- Version History ---
The changes in the 20141125 version are:
- when creating parity files, the main packet was not always being written to the parity
files when they were processed concurrently because the main packet was not being
safely appended to the list of packets to output because a non-thread-safe data
container (std::list<T>) was being used. This bug would manifest when a large number
of source files were being processed. Fixed by using tbb::concurrent_vector<T> instead
of std::list<T>.
- when creating parity files, the "Opening: <file>" messages will only be displayed for
the first n source files, where n defaults to 200. This restriction was added so that
creating parity files for a large number of source files would not cause a lot of
scrolling which in turn would make the processing take a long time. Use the new -z<n>
command line switch to set a different limit. Use -z0 to specify no limit.
- verification of extra files is now performed concurrently if requested to do so
(previously they were always verified serially)
- the -t parameter can now include a positive integer value to restrict the logical number
of CPUs with which to process data. The different variants are:
-t- verifies, repairs, and creates serially (no change)
-t+ verifies, repairs, and creates concurrently (no change)
-t0 verifies serially and repairs/creates concurrently (no change)
-t-n verifies, repairs, and creates concurrently using the maximum number of logical
CPUs minus n, or 1 (whichever is larger) for n > 0; n <= 0 is illegal
-t+n verifies, repairs, and creates concurrently using the maximum number of logical
CPUs, or n (whichever is smaller) for n > 0; n <= 0 is illegal
-t0n verifies serially and repairs/creates concurrently using:
for n > 0: the maximum number of logical CPUs, or n (whichever is smaller)
for n < 0: the maximum number of logical CPUs minus n, or 1 (whichever is larger)
for n = 0: illegal
For example, -t-1 on a 6 logical CPU system will use up to 5 logical CPUs. On the
same system, -t-7 will use up to 1 logical CPU, ie, process serially.
- "up to" is used because there may not be enough data to use the maximum number of
logical CPUs.
- the maximum number of logical CPUs may be determined by the operating system or the
hypervisor and may be less than the actual number of physical CPU cores, eg, when
running in a virtual machine.
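The arithmetic behind the -t-n and -t+n variants can be sketched as follows (hypothetical helpers, not the program's actual option parser; max_cpus is the logical CPU count the OS or hypervisor reports):

```cpp
#include <algorithm>

// -t-n: use the maximum number of logical CPUs minus n, or 1, whichever is larger.
int cpus_for_minus(int max_cpus, int n) {
    return std::max(1, max_cpus - n);
}

// -t+n: use the maximum number of logical CPUs, or n, whichever is smaller.
int cpus_for_plus(int max_cpus, int n) {
    return std::min(max_cpus, n);
}
```

These reproduce the example above: on a 6 logical CPU system, -t-1 yields 5 and -t-7 yields 1 (serial processing).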
- in the Windows version, the program's CPU scheduling priority can now be specified
using the -p parameter:
-pN to process at normal priority (Normal in Task Manager) [default]
-pL to process at low priority (Below Normal in Task Manager)
-pI to process at idle priority (Low in Task Manager)
- the heap became fragmented during the verification of data files because the checksum
data buffer was allocated and deallocated for each file verified, which resulted in the
program's memory footprint (aka its "working set") steadily increasing during the
verification phase. This would result in the 32-bit Windows version failing to verify
large data sets because it could not allocate verification data buffers. To solve this,
the checksum data buffer is no longer allocated and deallocated for each file verified.
Instead, a pool of checksum objects is created and that pool of objects is then used and
re-used for verifying data files. The size of the pool matches the number of logical
CPUs which the program is asked to use. This change benefits all versions of the program
because by reducing heap fragmentation, larger data sets can be processed using less
virtual memory.
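The pool described above can be sketched like this. The names are hypothetical and the pooled object is reduced to a plain byte buffer; the point is allocate-once, check-out/return instead of new/delete per file:

```cpp
#include <cstddef>
#include <memory>
#include <mutex>
#include <vector>

// Fixed-size pool of reusable checksum buffers: one buffer per logical CPU
// in use, created up front, so verifying many files no longer allocates and
// frees a buffer per file (which fragmented the heap).
class ChecksumPool {
public:
    ChecksumPool(std::size_t count, std::size_t buffer_bytes) {
        for (std::size_t i = 0; i != count; ++i)
            free_.push_back(std::make_unique<std::vector<char>>(buffer_bytes));
    }
    // Returns nullptr when all buffers are in use; the caller must wait or retry.
    std::unique_ptr<std::vector<char>> acquire() {
        std::lock_guard<std::mutex> lock(mu_);
        if (free_.empty()) return nullptr;
        auto buf = std::move(free_.back());
        free_.pop_back();
        return buf;
    }
    void release(std::unique_ptr<std::vector<char>> buf) {
        std::lock_guard<std::mutex> lock(mu_);
        free_.push_back(std::move(buf));
    }
private:
    std::mutex mu_;
    std::vector<std::unique_ptr<std::vector<char>>> free_;
};
```

Sizing the pool to the number of logical CPUs in use is enough because at most that many files are being checksummed at once.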
- numerous small code changes were made to remove unnecessary string copying. Such
redundant copying would further fragment the heap as well as use up memory for temporary
strings which did not need to be allocated in the first place.
- updated to Intel TBB 4.3 Update 1 (tbb43_20141023oss_src.tgz)
- removed use of MAX_PATH or other fixed-size path buffers to avoid buffer overflow errors
- the program failed to build under newer C++ standard libraries because they no longer
provide std::auto_ptr<T>. Fixed by either using std::unique_ptr<T> (if available) or by
providing our own version of std::auto_ptr<T>.
- the Mac OS x86 (32-bit) version now requires 10.5 or later
- stopped building the FreeBSD version because the FreeBSD ports system can now build the
par2 program and TBB library without requiring any changes to the sources of either and
because it isn't possible to build a "portable" version of the program, in the sense
that the TBB library cannot be in the same directory as the par2 executable - it must be
installed into /usr/lib/, and that is a job best left to the FreeBSD ports system.
The changes in the 20100203 version are:
- modified Makefile.am to use "ARCH_SCALAR" instead of "ARCH" to avoid a FreeBSD name clash
- fixed a 64-bit-only bug in reedsolomon-x86_64-mmx.s where a size of 8 bytes caused a segfault
(forgot to test for zero like the reedsolomon-i686-mmx.s file does); this bug only manifests in
the 64-bit Mac, 64-bit Linux and 64-bit FreeBSD versions; reproduced by creating/repairing a
file of exactly 16384 bytes
- updated to Intel TBB 2.2 (tbb22_20090809oss)
- the Mac build no longer includes the PowerPC variants (I don't use a PowerPC Mac anymore)
- the 32-bit and 64-bit Windows builds of both par2 and the TBB library are now statically
linked against the C runtime library to avoid the problem of requiring the installation of
the correct CRT library (DLL). As well, par2 is statically linked against the TBB library
to allow just one executable file to be installed (i.e., just par2.exe).
The changes in the 20090203 version are:
- fixed a bug which affected the Linux and Mac versions whereby repairs would fail if
the file being repaired was short or had one or two bad blocks (because the async write
to the file's last byte was failing).
- on Windows, the program now stores directory paths in par2 files using '/' as the path
separator instead of '\' (as per the Par 2.0 specification document). Note: directory
paths are stored only when the '-d' switch is used.
- merged the sources from the CPU-only and CPU/GPU versions so that both versions now
build from the same set of source files using different 'configure' options (Mac, Linux,
FreeBSD) or project files (Windows). See above for building instructions.
The changes in the 20081009 version are:
- added support for NVIDIA CUDA 2.0 technology, which allows the GPU on the video card to
be used to perform some of the processing workload in addition to the CPU on the mainboard.
See the "--- About the NVIDIA CUDA version ---" section in this file for limitations,
requirements, build instructions, licensing, and more information.
The changes in the 20081005 version are:
- asynchronous reading of a large number of small files would sometimes not complete which
caused the program to hang. Fixed by reverting to synchronous reading (most of the benefit
of async I/O is from async writing so this change does not affect overall performance).
- some operating systems have limits on the number of open files which was easily exceeded
when a large number of small files are being processed for par2 creation or for repair.
Fixed by closing the source files as soon as they are no longer needed to be opened (which
is determined by counting how many data blocks the file provides for creation/repair).
The changes in the 20080919 version are:
- added more information to a few of the error messages to make it easier to specify
block counts, etc. when using the -d option.
- redundancy can now be specified using floating point values instead of integral values,
eg, 8.5% instead of 8% or 9%.
- added the -0 option to create dummy par2 files. This was done so that the actual size
of the par2 files can be quickly determined. For example, suppose you wish to fill up
a CD-R's or DVD-R's remaining empty space with par2 files of the files filling up the
disc, then by using the -0 option, you can quickly work out whether the par2 files
will fit and by how much, which in turn allows you to maximize the use of the remaining
empty space (you would alter the block count number and/or size so that the optimal
number of blocks are created to fill up the remaining space). To determine how much
CD-R or DVD-R space you have to fill, find out how many blocks your blank disc has
(using a burning program such as ImgBurn [Windows]) and how many blocks your data
would occupy when burned (using an image creation program such as mkisofs [all
platforms] which has a handy -print-size option). ImgBurn [Windows] can also tell
you how many blocks you have for filling if you use its 'build' command.
WARNING: be careful when using this command that you don't burn the dummy par2 files
that it creates because they don't have any valid data in them. Remember, they are
created only to determine the actual size of the real par2 files that would be
created if you had not used the -0 option.
- added MMX-based code from Paul Houle's phpar2_12src version of par2cmdline-0.4. As
a result, the repair and creation of par2 files using x86 or x86_64 MMX code is about
20% faster than the scalar version in singlethreaded testing. Multithreaded testing
showed no noticeable improvement (ie, YMMV). The scalar version is used if your CPU
is not MMX capable. MMX CPUs: Intel Pentium II and later, AMD Athlon64 and later.
- added asynchronous I/O for platforms that support such I/O: Mac OS X, Windows,
GNU/Linux. This results in a small (~1-5%) improvement in throughput, especially for
repairing. Unfortunately, using async I/O causes a crash under FreeBSD, so the
pre-built binaries are built to only use synchronous I/O.
- first release of 32-bit and 64-bit PowerPC binaries for Mac OS X. The 32-bit version
requires at least 10.4, and the 64-bit version requires at least 10.5. The 64-bit
version is UNTESTED (because of lack of access to a G5 Mac).
- first release of a 64-bit x86_64 binary for GNU/Linux. Tested under the 64-bit
version of Gentoo 2008.0.
- the 64-bit Windows binary is built using the tbb20_20080408oss release of the TBB;
the Mac, GNU/Linux, FreeBSD and 32-bit Windows binaries are built using the
tbb21_009oss release of the TBB. The tbb21_009oss release does not support the
VC7.1 runtime libraries on Win64 so it was necessary to fallback to a previous
version for the Windows 64-bit binary.
The changes in the 20080420 version are:
- added the -t0 option to allow verification to be done serially but still perform
repair concurrently, and for creation, MD5 checksumming will be done serially
and par2 data creation will be done concurrently. The default is to perform
all operations concurrently, so if you want the new behaviour, you will need to
manually specify -t0 on the command line or build your own custom version of
the executable.
- if the realpath() API returned NULL, the par2 files created would end up with
the name of the first file in the list of files to create par2 files for. Fixed.
- no longer includes duplicate file names in the list of files to create redundancy
data for (which would otherwise bloat the .par2 files)
- now displays the instruction set being executed
- updated to use the tbb20_017oss_src.tar.gz version of the Intel TBB library.
The changes in the 20080203 version are:
- the Linux version wasn't working because it was not built correctly: the
reedsolomon-inner-i386-posix.s was using an incorrect include directive. Fixed.
*** WARNING ***
A consequence of this error is that par2 files created with the 20080116 Linux
binary contain incorrect repair data and therefore cannot be used to repair
data files. The par2 files will need to be created again using either the
20071128 build of the Linux binary or this build of it.
*** WARNING ***
- tweaked the Makefile and par2cmdline.h to allow for building under FreeBSD.
- first release of 32-bit and 64-bit binaries for FreeBSD (built under RELEASE 6.2).
- updated to use the 20080115 version of the Intel TBB library.
The changes in the 20080116 version are:
- the initial processing (creation) and verification (repair) of target files
is now performed serially because of complaints that concurrent processing
was causing disk thrashing. Since this part of the program's operation is
mostly I/O bound, the change back to serial processing is a reasonable change.
- full paths are now only displayed when a -d parameter is given to the
program, otherwise the original behavior of displaying just the file name
now occurs.
- Unicode support was added. This requires some explanation.
Windows version: previous versions processed file names and directory
paths using the default code page for non-Unicode programs, which is
typically whatever the current locale setting is. In other words,
file names that had characters that could not be represented in the
default code page ended up being mangled by the program, resulting
in .par2 files which contained mangled file names (directory names
also suffered mangling). Such .par2 files could not be used on other
computers unless they also used the same code page, which for POSIX
systems is very unlikely. The correct solution is to store and retrieve
all file names and directory paths using a Unicode representation.
To keep some backward compatibility, the names should be stored in
an 8-bit-per-character format (so that older .par2 files can still
be processed by the program), so decomposed (a.k.a. composite) UTF-8
was chosen as the canonical file name encoding for the storage of
file names and directory paths in .par2 files.
To implement this change, the Windows version now takes all file
names from the operating system as precomposed UTF-16 and converts
them to decomposed UTF-8 strings which are stored in memory and
in .par2 files. If the operating system needs to use the string,
it is converted back into precomposed UTF-16 and then passed to
the OS for use.
POSIX version: it is assumed that the operating system will deliver
and accept decomposed (a.k.a. composite) UTF-8 characters to/from
the program so no conversion is performed. Darwin / Mac OS X is
one such system that passes and accepts UTF-8 character strings, so
the Mac OS X version of the program works correctly with .par2
files containing Unicode file names. If the operating system
does not deliver or accept decomposed UTF-8 character strings,
this version (and previous versions) will not create .par2 files
that contain Unicode file names or directory paths, and such
files will cause mangled file/directory names when used on other
operating systems.
Summary:
[1] for .par2 files created on Windows using a version of
this program prior to this version and which contain non-ASCII
characters (characters outside the range of 0 - 127 (0x00 - 0x7F)
in numeric value), this program will be able to use such files
but will probably complain about missing files or will create
repaired files using the wrong file name or directory path, ie,
file name mangling will occur.
[2] for .par2 files created on UTF-8 based operating systems
using a prior version of this program, this version will be
able to correctly use such files (ie, the changes made to the
program should not cause any change in behavior, and no file
name mangling will occur).
[3] for .par2 files created on non-UTF-8 based operating systems
using a prior version of this program, this version will be
able to use such files but file name mangling will occur.
[4] for .par2 files created on UTF-8 based operating systems
using this version of this program, file name mangling will
not occur.
[5] for .par2 files created on non-UTF-8 based operating systems
using this version of this program, file name mangling will
occur.
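The UTF-16 to UTF-8 leg of the Windows conversion described above can be sketched as follows. This is illustrative only: real code must also apply the decomposed (NFD) normalization, which requires Unicode tables (e.g. from a library such as ICU) and is omitted here:

```cpp
#include <cstdint>
#include <string>

// Convert UTF-16 code units (with surrogate pairs) to UTF-8 bytes.
// Normalization to decomposed form is a separate step, not shown.
std::string utf16_to_utf8(const std::u16string& in) {
    std::string out;
    for (std::size_t i = 0; i < in.size(); ++i) {
        std::uint32_t cp = in[i];
        if (cp >= 0xD800 && cp <= 0xDBFF && i + 1 < in.size()) {
            // combine a high/low surrogate pair into one code point
            cp = 0x10000 + ((cp - 0xD800) << 10) + (in[++i] - 0xDC00);
        }
        if (cp < 0x80) {                       // 1-byte sequence (ASCII)
            out += static_cast<char>(cp);
        } else if (cp < 0x800) {               // 2-byte sequence
            out += static_cast<char>(0xC0 | (cp >> 6));
            out += static_cast<char>(0x80 | (cp & 0x3F));
        } else if (cp < 0x10000) {             // 3-byte sequence
            out += static_cast<char>(0xE0 | (cp >> 12));
            out += static_cast<char>(0x80 | ((cp >> 6) & 0x3F));
            out += static_cast<char>(0x80 | (cp & 0x3F));
        } else {                               // 4-byte sequence
            out += static_cast<char>(0xF0 | (cp >> 18));
            out += static_cast<char>(0x80 | ((cp >> 12) & 0x3F));
            out += static_cast<char>(0x80 | ((cp >> 6) & 0x3F));
            out += static_cast<char>(0x80 | (cp & 0x3F));
        }
    }
    return out;
}
```

ASCII-only names pass through unchanged, which is why older .par2 files without non-ASCII characters remain fully compatible.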
- split up the reedsolomon-inner.s file so that it builds
correctly under Darwin and other POSIX systems.
- changed the way the pre-built Mac OS X version is built because
the 64-bit version built under 10.4 (1) crashes when it is run
under 10.5, and (2) does not read par2 files when the files
reside on a SMB server (ie, a shared folder on a Windows
computer) because 10.4's SMB client software appears to
incorrectly service 64-bit client programs. These problems only
occurred with the 64-bit version; the 32-bit version works
correctly.
To solve both of these problems, the pre-built executable is now
released containing both a 32-bit executable built under 10.4
and a 64-bit executable built under 10.5. When run under 10.4,
the 64-bit executable does not execute because it is linked
against the 10.5 system libraries, so under 10.4, only the
32-bit executable is executed, which solves problem (2). When
run under 10.5 on a 64-bit x86 computer, the 64-bit executable
executes, which solves problem (1), and because 10.5's SMB
client correctly services 64-bit client programs, problem (2)
is solved.
The changes in the 20071128 version are:
- if par2 was asked to verify/repair with just a single .par2 file, it would
crash. Fixed.
- built for GNU/Linux using the Gentoo distribution (i386 version).
- updated to use the 20071030 version of the Intel TBB library.
The changes in the 20071121 version are:
- changed several concurrent loops from using TBB's parallel_for to
parallel_while so that files will be processed in a sequential (but
still concurrent/threaded) manner. For example, 100 files were
previously processed on dual core machines as:
Thread 1: file 1, file 2, file 3, ..., file 50
Thread 2: file 50, file 51, file 52, ..., file 100
which caused hard disk head thrashing. Now the threads will
process the files from file 1 to file 100 on a
first-come-first-served basis.
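The first-come-first-served scheduling can be sketched with a shared atomic counter. This is an illustrative model (the function and names are invented, and TBB's parallel_while does the real work in the program): each thread claims the next unclaimed file index, so the files are visited roughly in order 1..N instead of in two distant halves:

```cpp
#include <atomic>
#include <thread>
#include <vector>

// Records which thread "processed" each file; every index is claimed
// exactly once via fetch_add, so the writes never race.
std::vector<int> claim_order(int files, int threads) {
    std::atomic<int> next{0};
    std::vector<int> processed(files, -1);
    std::vector<std::thread> pool;
    for (int t = 0; t != threads; ++t)
        pool.emplace_back([&, t] {
            for (int i = next.fetch_add(1); i < files; i = next.fetch_add(1))
                processed[i] = t;   // "process" file i on thread t
        });
    for (auto& th : pool) th.join();
    return processed;
}
```

Because consecutive indices are handed out in sequence, the disk head moves forward through the file list rather than seeking between two widely separated regions.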
- limited the rate at which cout was called to at most 10 times per
second.
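A 10-updates-per-second cap like the one described can be sketched with a steady clock (illustrative class, not the program's code):

```cpp
#include <chrono>

// Emit only if at least 100 ms have passed since the previous emission,
// capping progress output at roughly 10 updates per second.
class OutputThrottle {
public:
    OutputThrottle()
        : last_(std::chrono::steady_clock::now() - std::chrono::hours(1)) {}
    bool should_emit() {
        auto now = std::chrono::steady_clock::now();
        if (now - last_ < std::chrono::milliseconds(100)) return false;
        last_ = now;
        return true;
    }
private:
    std::chrono::steady_clock::time_point last_;  // backdated so the first call emits
};
```

Dropped updates cost nothing: the next permitted call prints the current (fresher) progress value anyway.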
- when building for i386 using GCC, this version will now build
with an assembler version of the inner Reed-Solomon loop because
the code generated by GCC was not as fast/small as the Visual
C++ version. Doing this should bring the GCC-built (POSIX)
version's speed up to that of the Visual C++ (Windows) version.
- for canonicalising paths on POSIX systems, the program will now
try to use the realpath() API, if it's available, instead of the
fragile code in the original version.
- on POSIX systems, attempting to use a parameter of "-d." for par2
creation would cause the program to fail because it was not
resolving a partial path to a canonical full path. Fixed.
The changes in the 20071022 version are:
- synchronised the sources with the version of par2cmdline in the CVS at <http://sourceforge.net/projects/parchive>
- built against the 20070927 version of the Intel TBB
- tweaked the inner loop of the Reed Solomon code so that the compiler
will produce faster/better/smaller code (which may or may not speed up
the program).
- added support for creating and repairing data files in directory trees
via the new -d<directory> command line switch.
The original modifications for this were done by Pacer:
<http://www.quickpar.co.uk/forum/viewtopic.php4?t=460&start=0&postdays=0&postorder=asc&highlight=>
This version defaults to the original behaviour of par2cmdline: if no
-d switch is provided then the data files are expected to be in the same
directory that the .par2 files are in.
Providing a -d switch will change the way that par2cmdline behaves as follows.
For par2 creation, any file inside the provided <directory> will have
its sub-path stored in the par2 files. For par2 repair, files for
verification/repair will be searched for inside the provided <directory>.
Example:
in /users/home/vincent/pictures/ there is:
    2007_01_vacation_fiji/
        01.jpg
        02.jpg
        03.jpg
        04.jpg
    2007_03_business_trip_usa/
        01.jpg
        02.jpg
    2007_06_wedding/
        01.jpg
        02.jpg
        03.jpg
        04.jpg
        05.jpg
        06.jpg
Using the command:
./par2 c -d/users/home/vincent/pictures/ /users/home/vincent/pictures.par2 /users/home/vincent/pictures
will create par2 files in /users/home/vincent containing sub-paths such as:
2007_01_vacation_fiji/01.jpg
2007_01_vacation_fiji/02.jpg
2007_01_vacation_fiji/03.jpg
2007_01_vacation_fiji/04.jpg
2007_03_business_trip_usa/01.jpg
2007_03_business_trip_usa/02.jpg
2007_06_wedding/01.jpg
etc. etc.
If you later try to repair the files which are now in /users/home/joe/pictures,
you would use the command:
./par2 r -d/users/home/joe/pictures/ /users/home/joe/pictures.par2
The par2 file could be anywhere on your disk: as long as the -d<directory>
switch specifies the root of the files, the verification/repair will occur correctly.
Notes:
[1] the directory given to -d does not need to have a trailing '/' character.
[2] on Windows, either / or \ can be used.
[3] partial paths can be used. For example, if the current directory is
/users/home/vincent, then this can be used instead of the above command:
./par2 c -dpictures pictures.par2 pictures
[4] if a directory has spaces or other characters that need escaping from the
shell then the use of double quotes is recommended. For example:
./par2 c "-dpicture collection" "picture collection.par2" "picture collection"
The changes in the 20070927 version are:
- applied a fix for a bug reported by user 'shenhanc' in
Par2CreatorSourceFile.cpp where a loop variable would not get
incremented when silent output was requested.
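The bug class is easy to reproduce in miniature. This is a hypothetical reconstruction of the pattern, not the actual Par2CreatorSourceFile.cpp code; the guard counter exists only to stop the demonstration from looping forever:

```cpp
#include <cstddef>

// Hypothetical reconstruction of the reported bug class: the loop variable
// is advanced only on the verbose (progress-printing) path, so requesting
// silent output leaves it stuck. The guard bounds the otherwise-endless loop.
size_t iterations(bool silent, size_t count) {
    size_t i = 0, guard = 0;
    while (i < count && guard < 1000) {
        ++guard;
        if (!silent)
            ++i;   // bug: with silent == true, i never advances
    }
    return guard;
}
```

The fix is simply to move the increment out of the output-only branch.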
The changes in the 20070926 version are:
- fixed an integer overflow bug in Par2CreatorSourceFile.cpp which resulted
in incorrect MD5 hashes being stored in par2 files when they were created
from source files that were larger than or equal to 4GB in size. This bug
affected all 32-bit builds of the program. It did not affect the 64-bit
builds on those platforms where sizeof(size_t) == 8.
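The overflow can be modelled directly: on a 32-bit build, size_t is 32 bits wide, so any length of 4 GiB or more wraps around before it reaches the hashing code. wrapped_length is an illustrative helper, not a name from the sources:

```cpp
#include <cstdint>

// Models the 32-bit truncation: storing a >= 4 GiB file length in a 32-bit
// size_t silently discards the high bits, so only (filesize mod 2^32) bytes
// were accounted for when the MD5 hash was computed.
uint32_t wrapped_length(uint64_t filesize) {
    return static_cast<uint32_t>(filesize);
}
```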
The changes in the 20070924 version are:
- the original par2cmdline-0.4 sources were not able to process files
larger than 2GB on the Win32 platform because diskfile.cpp used the
stat() function, whose file-size field is a signed 32-bit value on
Win32. This was changed to use _stati64(), which returns a proper
64-bit file size. Note that the FAT32 file system from the Windows 95
era does not support files larger than 4 GB, so this change is mainly
applicable to files on NTFS disks - the default file system on
Windows 2000/XP/Vista.
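The truncation is easy to see in isolation. stat32_size is an illustrative stand-in for the signed 32-bit file-size field of the old Win32 stat() structure, not an actual API:

```cpp
#include <cstdint>

// Models the old Win32 stat() behaviour: the size field was a signed 32-bit
// value, so a 3 GiB file reads back as a negative number, and anything over
// 4 GiB loses its high bits entirely. _stati64() widens the field to 64 bits.
int32_t stat32_size(uint64_t real_size) {
    return static_cast<int32_t>(real_size);
}
```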
The changes in the 20070831 version are:
- modified to utilise Intel TBB 2.0.
Vincent Tan.
November 25, 2014.
<chuchusoft@gmail.com>
//
// Modifications for concurrent processing, Unicode support, and hierarchical
// directory support are Copyright (c) 2007-2014 Vincent Tan.
// Search for "#if WANT_CONCURRENT" for concurrent code.
// Concurrent processing utilises Intel Threading Building Blocks 4.3 Update 1,
// Copyright (c) 2007-2014 Intel Corp.
//
