Compare commits

...

66 Commits

Author SHA1 Message Date
advplyr
90f4833c9e Version bump v2.7.2 2024-01-16 17:26:50 -06:00
advplyr
c0cb3a176f Update:Hide audiobook tools for windows install, remove debian folder picker alert 2024-01-16 17:19:45 -06:00
advplyr
7b0fa48e2e Update jsdocs for expanded library items 2024-01-16 16:31:16 -06:00
advplyr
b51853b3df Update:Use raw cover art for media session #2514 2024-01-15 08:34:12 -06:00
advplyr
f5545cd3f4 Add:Scanner extracts cover from comic files #1837 and ComicInfo.xml parser 2024-01-14 17:51:26 -06:00
advplyr
e76af3bfc2 Fix comic page menu dropdown highlight correct page 2024-01-13 16:41:13 -06:00
advplyr
850397e4c1 Add:Playlist button to podcast episodes on latest page #2455 2024-01-12 17:58:07 -06:00
advplyr
e8fa029df7 Fix:Specific podcast rss feed cannot be fetched due to accept header #2446 2024-01-10 08:12:26 -06:00
advplyr
1a361c91f1 Merge pull request #2506 from FreedomBen/remove-dev-logs
Change `Logger.dev` calls to `Logger.debug`
2024-01-09 16:47:21 -06:00
Benjamin Porter
4a76059608 Change Logger.dev calls to Logger.debug
Logger.dev is in an odd spot: it doesn't fit into the standard log
levels. It is called directly by some code and only checks whether a
property (set from an env var) is set before deciding whether to print.

This standardizes on `debug` by changing the `dev` calls to `debug` and
removes the now-unused code.
2024-01-09 15:24:23 -07:00
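The change above removes a logging path that bypassed log levels. A minimal sketch of the before/after pattern, with hypothetical names (the actual Logger implementation in the repo may differ): `dev` only checks an env-derived flag, while `debug` respects the configured log level.

```js
// Hypothetical sketch of the pattern being removed, not the actual Logger source.
const LogLevel = { DEBUG: 1, INFO: 2, WARN: 3, ERROR: 4 }

class Logger {
  constructor() {
    this.logLevel = LogLevel.INFO
    // Before: a separate flag derived from an env var gated Logger.dev output,
    // independent of the configured log level.
    this.isDev = process.env.NODE_ENV !== 'production'
  }

  debug(...args) {
    // Standard level check: prints only when the configured level allows it.
    if (this.logLevel > LogLevel.DEBUG) return
    console.debug('[Debug]', ...args)
  }

  // Removed by the change above; call sites now use debug() instead.
  dev(...args) {
    if (!this.isDev) return
    console.log('[Dev]', ...args)
  }
}

module.exports = new Logger()
```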
advplyr
da25eff5c1 Fix:Parse series sequence from OPF in cases where series_index is not directly underneath series meta #2505 2024-01-08 18:21:15 -06:00
advplyr
69e23ef9f2 Add:Epub metadata parser and cover extractor #1479 2024-01-07 17:51:07 -06:00
advplyr
48a08e9659 Merge pull request #2503 from Machou/master
Little Missed Update fr.json
2024-01-07 12:42:09 -06:00
Machou
4608f91ec6 Update fr.json 2024-01-07 02:41:16 +01:00
advplyr
e88c1fa329 Update:Show tooltip for library item card titles that are truncated #2451
- Refactored tooltips so that they don't overflow the window
2024-01-06 15:54:48 -06:00
advplyr
935e545caa Update readme for iOS beta full 2024-01-06 14:13:39 -06:00
advplyr
a426da534c Fix:Export OPML not escaping characters #2487 2024-01-05 14:45:25 -06:00
advplyr
eaf6bf29cc Fix:Improve performance for podcast rss feed episodes modal for large rss feeds 2024-01-05 14:39:25 -06:00
advplyr
a0eb6bd3dc Fix:Refresh podcast episode table when new episodes are downloaded 2024-01-05 14:38:29 -06:00
advplyr
fbe228a4f8 Merge pull request #2485 from Machou/patch-1
Update fr.json
2024-01-05 10:40:29 -06:00
advplyr
578a59063f Update discord invite link 2024-01-05 09:24:18 -06:00
Machou
ffa7cc0d22 Update fr.json 2024-01-05 07:19:07 +01:00
advplyr
4f9969cd9b Merge pull request #2488 from FreedomBen/add-init-system-to-docker
Add tini as PID 1 handler in container image
2024-01-04 13:50:54 -06:00
advplyr
9f909b0d85 Update:Library folder browser to also work for debian and windows 2024-01-03 16:23:17 -06:00
Benjamin Porter
baa65b8155 Add tini as PID 1 handler in container image
This PR adds `tini` to the container image and uses it as PID 1 when
starting the container. This ensures proper PID 1 signal handling is in
place and that signals are forwarded to the underlying Node.js process,
so the ABS process has a chance to receive and handle signals other than
`SIGKILL`, such as the important `SIGINT`.

This is somewhat related to #2445. Without this, the signal handled in
#2445 won't be received when running in a container.

Some background:

In Linux, PID 1 has special signal-handling duties that differ from
those of other processes. Node doesn't handle these signals properly,
which can lead to problems ranging from annoying to disruptive. PID 1
also has child-reaping duties that can lead to resource exhaustion if
not handled properly.

For example, the container ignores `SIGINT` (Ctrl+C) as well as `docker stop`,
which is annoying in development because you have to `kill` the process or
wait for the timeout to be reached. In a production environment (such as
Kubernetes) this leads to signal escalation and adds unnecessary delays to
deployments and restarts, as K8s has to wait for the timeout to be reached
before sending `SIGKILL`.

At best this is annoying and adds unnecessary delays. At worst it can
lead to file/data corruption, as the process doesn't get a chance to
clean anything up before it is sent `SIGKILL`. Without a proper PID 1 to
forward signals, only `SIGKILL` can be used to terminate the running
process.
2024-01-03 13:55:43 -07:00
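A minimal, hedged sketch of the graceful shutdown this enables (not the actual audiobookshelf code; see #2445 for the real SIGINT handler): once a proper PID 1 such as tini forwards `SIGINT`/`SIGTERM` into the container, the Node.js process can run cleanup before exiting instead of being killed outright.

```js
const http = require('http')

// Stand-in for the real server; assumes cleanup means closing the listener.
const server = http.createServer((req, res) => res.end('ok'))
server.listen(8080)

function shutdown(signal) {
  console.log(`Received ${signal}, shutting down gracefully`)
  // Finish in-flight requests and release resources; a real server would also
  // close sockets, watchers, and database connections here.
  server.close(() => process.exit(0))
  // Fallback: force-exit if cleanup hangs.
  setTimeout(() => process.exit(1), 10000).unref()
}

// Without an init process such as tini acting as PID 1 and forwarding signals,
// these handlers are never reached inside the container and only SIGKILL
// terminates the process.
process.on('SIGINT', () => shutdown('SIGINT'))
process.on('SIGTERM', () => shutdown('SIGTERM'))
```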
Machou
a1e321b153 Update fr.json 2024-01-03 20:16:21 +01:00
advplyr
8c6a2ac5dd Merge pull request #2391 from mikiher/binary-manager
Add a binary manager that finds ffmpeg and ffprobe and installs them if not found
2024-01-02 14:25:56 -06:00
advplyr
b489bf9236 Restrict binary manager to Windows or development 2024-01-02 14:24:59 -06:00
advplyr
aa63aa6cf3 Merge branch 'master' into binary-manager 2024-01-02 14:16:27 -06:00
advplyr
9a2b93fb37 Version bump v2.7.1 2023-12-31 15:37:23 -06:00
advplyr
e8ea7efc98 Merge branch 'master' of https://github.com/advplyr/audiobookshelf 2023-12-31 15:36:37 -06:00
advplyr
81a76593da Fix:Merging chapters from multiple audio files with the same chapter titles #2461 2023-12-31 15:35:17 -06:00
advplyr
5336864f7d Merge pull request #2465 from thevoltagesource/getFileMtimeMs_Unhandled_Exception
Add try/catch to fileUtils.getFileMtimeMs
2023-12-31 15:34:43 -06:00
advplyr
d38058e1d2 Fix:Podcast episode time remaining shown on button showing 0 seconds after toggling mark as finished 2023-12-31 15:32:44 -06:00
advplyr
fececd4651 Fix:Playlists navigation button not showing on mobile screen #2469 2023-12-31 15:09:35 -06:00
advplyr
021adf3104 Update:Podcast episode table is lazy loaded #1549 2023-12-31 14:51:01 -06:00
advplyr
160c83df4a Update:podcastEpisodes table index added for createdAt column #2073 #2075 2023-12-30 16:14:14 -06:00
advplyr
456bb87a00 Update:Find one library item endpoint sequelize query split into two queries to improve performance #2073 #2075 2023-12-30 12:12:48 -06:00
advplyr
707451309c Merge branch 'master' of https://github.com/advplyr/audiobookshelf 2023-12-29 17:05:40 -06:00
advplyr
269676e8a5 Update:CORS for /cover API endpoint for use in canvas in the mobile apps 2023-12-29 17:05:35 -06:00
Jacob Southard
e4effebc19 Add try/catch to fileutils.getFileMtimeMs 2023-12-29 10:04:59 -06:00
advplyr
fbbceddba8 Merge pull request #2454 from mikiher/socket-authority-close
Add SocketAuthority.close()
2023-12-28 16:32:40 -06:00
advplyr
9a634e0de5 Add JS docs for server stop 2023-12-28 16:32:21 -06:00
mikiher
21d0d43edc Add SocketAuthority.close() 2023-12-27 15:33:33 +02:00
mikiher
3051b963ef Merge branch 'advplyr:master' into binary-manager 2023-12-27 06:44:22 +02:00
advplyr
0d0bdce337 Fix:Fetch RSS feed request accept header #2446 2023-12-25 13:15:55 -06:00
advplyr
bdb5dc8c28 Merge pull request #2445 from mikiher/sigint-handler
Add a SIGINT handler for proper server shutdown
2023-12-25 12:51:22 -06:00
mikiher
209847d98a Add a SIGINT handler for proper server shutdown 2023-12-25 09:25:04 +02:00
advplyr
14f42e15d1 Fix:Book scanner update book series sequence if changed 2023-12-24 11:53:57 -06:00
advplyr
7402e4811d Merge pull request #2444 from jedrus2000/opf-multiple-series-support
Add: OPF file supports multiple series as sequence of : calibre:series and calibre:series_index; including tests
2023-12-24 11:42:06 -06:00
advplyr
6de0465b86 Update opf parser to ignore series with empty content and add tests 2023-12-24 11:41:27 -06:00
Andrzej Bargański
cd7c4baaaf Add: OPF file supports multiple series as sequence of : calibre:series and calibre:series_index; including tests 2023-12-24 00:43:42 +01:00
mikiher
8f7a420cca Fix directory writable check (fs.access not working on Windows) 2023-12-14 09:47:18 +02:00
advplyr
6f6395bad7 Only log update binary env path if it was updated 2023-12-07 17:32:06 -06:00
mikiher
6afb8de3dd Remove ffbinaries local cache 2023-12-08 00:53:53 +02:00
mikiher
0e62ccc7aa Merge branch 'binary-manager' of https://github.com/mikiher/audiobookshelf into binary-manager 2023-12-07 23:51:33 +02:00
mikiher
09282a9a62 Remove all callbacks and refactor spaghetti code in downloadUrls 2023-12-07 23:49:46 +02:00
advplyr
18b3ab5610 Revert package-lock updates 2023-12-07 15:12:49 -06:00
mikiher
699a658df9 Remove debug printing from libs/ffbinaries 2023-12-07 08:50:45 +02:00
mikiher
67ccd2c1fb Fix test after switching to libs/ffbinaries 2023-12-06 13:45:28 +02:00
mikiher
898b072e68 Merge branch 'advplyr:master' into binary-manager 2023-12-06 09:27:17 +02:00
advplyr
61a0126278 Remove ffbinaries dependency 2023-12-05 17:35:57 -06:00
advplyr
1ce1904c89 Add ffbinaries lib 2023-12-05 17:35:15 -06:00
mikiher
c074c835d4 Remove semicolons from test 2023-12-05 22:18:37 +02:00
mikiher
2e989fbe83 Add BinaryManager 2023-12-05 21:19:17 +02:00
mikiher
b1b325d00b Add ffbinaries dependency 2023-12-05 21:18:30 +02:00
78 changed files with 2836 additions and 520 deletions

View File

@@ -11,7 +11,7 @@ body:
value: "### Mobile app issues report [here](https://github.com/advplyr/audiobookshelf-app/issues/new/choose)."
- type: markdown
attributes:
value: "### Join the [discord server](https://discord.gg/pJsjuNCKRq) for questions or if you are not sure about a bug."
value: "### Join the [discord server](https://discord.gg/HQgCbd6E75) for questions or if you are not sure about a bug."
- type: markdown
attributes:
value: "## Be as descriptive as you can. Include screenshots, error logs, browser, file types, everything you can think of that might be relevant."

View File

@@ -1,7 +1,7 @@
blank_issues_enabled: false
contact_links:
- name: Discord
url: https://discord.gg/pJsjuNCKRq
url: https://discord.gg/HQgCbd6E75
about: Ask questions, get help troubleshooting, and join the Abs community here.
- name: Matrix
url: https://matrix.to/#/#audiobookshelf:matrix.org

.gitignore
View File

@@ -13,6 +13,8 @@
/deploy/
/coverage/
/.nyc_output/
/ffmpeg*
/ffprobe*
sw.*
.DS_STORE

View File

@@ -18,7 +18,8 @@ RUN apk update && \
ffmpeg \
make \
python3 \
g++
g++ \
tini
COPY --from=tone /usr/local/bin/tone /usr/local/bin/
COPY --from=build /client/dist /client/dist
@@ -31,4 +32,5 @@ RUN apk del make python3 g++
EXPOSE 80
ENTRYPOINT ["tini", "--"]
CMD ["node", "index.js"]

View File

@@ -22,6 +22,10 @@
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 17V7m0 10a2 2 0 01-2 2H5a2 2 0 01-2-2V7a2 2 0 012-2h2a2 2 0 012 2m0 10a2 2 0 002 2h2a2 2 0 002-2M9 7a2 2 0 012-2h2a2 2 0 012 2m0 10V7m0 10a2 2 0 002 2h2a2 2 0 002-2V7a2 2 0 00-2-2h-2a2 2 0 00-2 2" />
</svg>
</nuxt-link>
<nuxt-link v-if="showPlaylists" :to="`/library/${currentLibraryId}/bookshelf/playlists`" class="flex-grow h-full flex justify-center items-center" :class="isPlaylistsPage ? 'bg-primary bg-opacity-80' : 'bg-primary bg-opacity-40'">
<p v-if="isPlaylistsPage || isPodcastLibrary" class="text-sm">{{ $strings.ButtonPlaylists }}</p>
<span v-else class="material-icons-outlined text-lg">queue_music</span>
</nuxt-link>
<nuxt-link v-if="isBookLibrary" :to="`/library/${currentLibraryId}/bookshelf/collections`" class="flex-grow h-full flex justify-center items-center" :class="isCollectionsPage ? 'bg-primary bg-opacity-80' : 'bg-primary bg-opacity-40'">
<p v-if="isCollectionsPage" class="text-sm">{{ $strings.ButtonCollections }}</p>
<span v-else class="material-icons-outlined text-lg">collections_bookmark</span>
@@ -293,6 +297,9 @@ export default {
}
return items
},
showPlaylists() {
return this.$store.state.libraries.numUserPlaylists > 0
}
},
methods: {

View File

@@ -349,7 +349,7 @@ export default {
}
if ('mediaSession' in navigator) {
var coverImageSrc = this.$store.getters['globals/getLibraryItemCoverSrc'](this.streamLibraryItem, '/Logo.png')
var coverImageSrc = this.$store.getters['globals/getLibraryItemCoverSrc'](this.streamLibraryItem, '/Logo.png', true)
const artwork = [
{
src: coverImageSrc

View File

@@ -8,10 +8,10 @@
<!-- Alternative bookshelf title/author/sort -->
<div v-if="isAlternativeBookshelfView || isAuthorBookshelfView" class="absolute left-0 z-50 w-full" :style="{ bottom: `-${titleDisplayBottomOffset}rem` }">
<div :style="{ fontSize: 0.9 * sizeMultiplier + 'rem' }">
<div class="flex items-center">
<span class="truncate">{{ displayTitle }}</span>
<ui-tooltip :text="displayTitle" :disabled="!displayTitleTruncated" direction="bottom" :delayOnShow="500" class="flex items-center">
<p ref="displayTitle" class="truncate">{{ displayTitle }}</p>
<widgets-explicit-indicator :explicit="isExplicit" />
</div>
</ui-tooltip>
</div>
<p class="truncate text-gray-400" :style="{ fontSize: 0.8 * sizeMultiplier + 'rem' }">{{ displayLineTwo || '&nbsp;' }}</p>
<p v-if="displaySortLine" class="truncate text-gray-400" :style="{ fontSize: 0.8 * sizeMultiplier + 'rem' }">{{ displaySortLine }}</p>
@@ -164,6 +164,7 @@ export default {
imageReady: false,
selected: false,
isSelectionMode: false,
displayTitleTruncated: false,
showCoverBg: false
}
},
@@ -642,6 +643,12 @@ export default {
}
this.libraryItem = libraryItem
this.$nextTick(() => {
if (this.$refs.displayTitle) {
this.displayTitleTruncated = this.$refs.displayTitle.scrollWidth > this.$refs.displayTitle.clientWidth
}
})
},
clickCard(e) {
if (this.processing) return

View File

@@ -2,8 +2,11 @@
<div class="w-full h-full overflow-hidden overflow-y-auto px-4 py-6">
<p class="text-xl font-semibold mb-2">{{ $strings.HeaderAudiobookTools }}</p>
<!-- alert for windows install -->
<widgets-alert v-if="isWindowsInstall" type="warning" class="my-8 text-base">Not supported for the Windows install yet</widgets-alert>
<!-- Merge to m4b -->
<div v-if="showM4bDownload" class="w-full border border-black-200 p-4 my-8">
<div v-if="showM4bDownload && !isWindowsInstall" class="w-full border border-black-200 p-4 my-8">
<div class="flex flex-wrap items-center">
<div>
<p class="text-lg">{{ $strings.LabelToolsMakeM4b }}</p>
@@ -19,22 +22,8 @@
</div>
</div>
<!-- Split to mp3 -->
<!-- <div v-if="showMp3Split" class="w-full border border-black-200 p-4 my-8">
<div class="flex items-center">
<div>
<p class="text-lg">{{ $strings.LabelToolsSplitM4b }}</p>
<p class="max-w-sm text-sm pt-2 text-gray-300">{{ $strings.LabelToolsSplitM4bDescription }}</p>
</div>
<div class="flex-grow" />
<div>
<ui-btn :disabled="true">{{ $strings.MessageNotYetImplemented }}</ui-btn>
</div>
</div>
</div> -->
<!-- Embed Metadata -->
<div v-if="mediaTracks.length" class="w-full border border-black-200 p-4 my-8">
<div v-if="mediaTracks.length && !isWindowsInstall" class="w-full border border-black-200 p-4 my-8">
<div class="flex items-center">
<div>
<p class="text-lg">{{ $strings.LabelToolsEmbedMetadata }}</p>
@@ -122,6 +111,12 @@ export default {
},
isEncodeTaskRunning() {
return this.encodeTask && !this.encodeTask?.isFinished
},
isWindowsInstall() {
return this.Source == 'windows'
},
Source() {
return this.$store.state.Source
}
},
methods: {

View File

@@ -31,7 +31,7 @@
<ui-btn class="w-full mt-2" color="primary" @click="browseForFolder">{{ $strings.ButtonBrowseForFolder }}</ui-btn>
</div>
</div>
<modals-libraries-folder-chooser v-else :paths="folderPaths" @back="showDirectoryPicker = false" @select="selectFolder" />
<modals-libraries-lazy-folder-chooser v-else :paths="folderPaths" @back="showDirectoryPicker = false" @select="selectFolder" />
</div>
</template>

View File

@@ -4,35 +4,37 @@
<span class="material-icons text-3xl cursor-pointer hover:text-gray-300" @click="$emit('back')">arrow_back</span>
<p class="px-4 text-xl">{{ $strings.HeaderChooseAFolder }}</p>
</div>
<div v-if="allFolders.length" class="w-full bg-primary bg-opacity-70 py-1 px-4 mb-2">
<p class="font-mono truncate">{{ selectedPath || '\\' }}</p>
<div v-if="rootDirs.length" class="w-full bg-primary bg-opacity-70 py-1 px-4 mb-2">
<p class="font-mono truncate">{{ selectedPath || '/' }}</p>
</div>
<div v-if="allFolders.length" class="flex bg-primary bg-opacity-50 p-4 folder-container">
<div v-if="rootDirs.length" class="relative flex bg-primary bg-opacity-50 p-4 folder-container">
<div class="w-1/2 border-r border-bg h-full overflow-y-auto">
<div v-if="level > 0" class="w-full p-1 cursor-pointer flex items-center" @click="goBack">
<div v-if="level > 0" class="w-full p-1 cursor-pointer flex items-center hover:bg-white/10" @click="goBack">
<span class="material-icons bg-opacity-50 text-yellow-200" style="font-size: 1.2rem">folder</span>
<p class="text-base font-mono px-2">..</p>
</div>
<div v-for="dir in _directories" :key="dir.path" class="dir-item w-full p-1 cursor-pointer flex items-center hover:text-white text-gray-200" :class="dir.className" @click="selectDir(dir)">
<div v-for="dir in _directories" :key="dir.path" class="dir-item w-full p-1 cursor-pointer flex items-center hover:text-white text-gray-200 hover:bg-white/10" :class="dir.className" @click="selectDir(dir)">
<span class="material-icons bg-opacity-50 text-yellow-200" style="font-size: 1.2rem">folder</span>
<p class="text-base font-mono px-2 truncate">{{ dir.dirname }}</p>
<span v-if="dir.dirs && dir.dirs.length && dir.path === selectedPath" class="material-icons" style="font-size: 1.1rem">arrow_right</span>
<span v-if="dir.path === selectedPath" class="material-icons" style="font-size: 1.1rem">arrow_right</span>
</div>
</div>
<div class="w-1/2 h-full overflow-y-auto">
<div v-for="dir in _subdirs" :key="dir.path" :class="dir.className" class="dir-item w-full p-1 cursor-pointer flex items-center hover:text-white text-gray-200" @click="selectSubDir(dir)">
<div v-for="dir in _subdirs" :key="dir.path" :class="dir.className" class="dir-item w-full p-1 cursor-pointer flex items-center hover:text-white text-gray-200 hover:bg-white/10" @click="selectSubDir(dir)">
<span class="material-icons bg-opacity-50 text-yellow-200" style="font-size: 1.2rem">folder</span>
<p class="text-base font-mono px-2 truncate">{{ dir.dirname }}</p>
</div>
</div>
<div v-if="loadingDirs" class="absolute inset-0 w-full h-full flex items-center justify-center bg-black/10">
<ui-loading-indicator />
</div>
</div>
<div v-else-if="loadingFolders" class="py-12 text-center">
<div v-else-if="initialLoad" class="py-12 text-center">
<p>{{ $strings.MessageLoadingFolders }}</p>
</div>
<div v-else class="py-12 text-center max-w-sm mx-auto">
<p class="text-lg mb-2">{{ $strings.MessageNoFoldersAvailable }}</p>
<p class="text-gray-300 mb-2">{{ $strings.NoteFolderPicker }}</p>
<p v-if="isDebian" class="text-red-400">{{ $strings.NoteFolderPickerDebian }}</p>
</div>
<div class="w-full py-2">
@@ -51,11 +53,12 @@ export default {
},
data() {
return {
loadingFolders: false,
allFolders: [],
initialLoad: false,
loadingDirs: false,
isPosix: true,
rootDirs: [],
directories: [],
selectedPath: '',
selectedFullPath: '',
subdirs: [],
level: 0,
currentDir: null,
@@ -89,68 +92,91 @@ export default {
...d
}
})
},
isDebian() {
return this.Source == 'debian'
},
Source() {
return this.$store.state.Source
}
},
methods: {
goBack() {
var splitPaths = this.selectedPath.split('\\').slice(1)
var prev = splitPaths.slice(0, -1).join('\\')
async goBack() {
let selPath = this.selectedPath.replace(/^\//, '')
var splitPaths = selPath.split('/')
var currDirs = this.allFolders
for (let i = 0; i < splitPaths.length; i++) {
var _dir = currDirs.find((dir) => dir.dirname === splitPaths[i])
if (_dir && _dir.path.slice(1) === prev) {
this.directories = currDirs
this.selectDir(_dir)
return
} else if (_dir) {
currDirs = _dir.dirs
}
let previousPath = ''
let lookupPath = ''
if (splitPaths.length > 2) {
lookupPath = splitPaths.slice(0, -2).join('/')
}
previousPath = splitPaths.slice(0, -1).join('/')
if (!this.isPosix) {
// For windows drives add a trailing slash. e.g. C:/
if (!this.isPosix && lookupPath.endsWith(':')) {
lookupPath += '/'
}
if (!this.isPosix && previousPath.endsWith(':')) {
previousPath += '/'
}
} else {
// Add leading slash
if (previousPath) previousPath = '/' + previousPath
if (lookupPath) lookupPath = '/' + lookupPath
}
this.level--
this.subdirs = this.directories
this.selectedPath = previousPath
this.directories = await this.fetchDirs(lookupPath, this.level)
},
selectDir(dir) {
async selectDir(dir) {
if (dir.isUsed) return
this.selectedPath = dir.path
this.selectedFullPath = dir.fullPath
this.level = dir.level
this.subdirs = dir.dirs
this.subdirs = await this.fetchDirs(dir.path, dir.level + 1)
},
selectSubDir(dir) {
async selectSubDir(dir) {
if (dir.isUsed) return
this.selectedPath = dir.path
this.selectedFullPath = dir.fullPath
this.level = dir.level
this.directories = this.subdirs
this.subdirs = dir.dirs
this.subdirs = await this.fetchDirs(dir.path, dir.level + 1)
},
selectFolder() {
if (!this.selectedPath) {
console.error('No Selected path')
return
}
if (this.paths.find((p) => p.startsWith(this.selectedFullPath))) {
if (this.paths.find((p) => p.startsWith(this.selectedPath))) {
this.$toast.error(`Oops, you cannot add a parent directory of a folder already added`)
return
}
this.$emit('select', this.selectedFullPath)
this.$emit('select', this.selectedPath)
this.selectedPath = ''
this.selectedFullPath = ''
},
fetchDirs(path, level) {
this.loadingDirs = true
return this.$axios
.$get(`/api/filesystem?path=${path}&level=${level}`)
.then((data) => {
console.log('Fetched directories', data.directories)
this.isPosix = !!data.posix
return data.directories
})
.catch((error) => {
console.error('Failed to get filesystem paths', error)
this.$toast.error('Failed to get filesystem paths')
return []
})
.finally(() => {
this.loadingDirs = false
})
},
async init() {
this.loadingFolders = true
this.allFolders = await this.$store.dispatch('libraries/loadFolders')
this.loadingFolders = false
this.initialLoad = true
this.rootDirs = await this.fetchDirs('', 0)
this.initialLoad = false
this.directories = this.allFolders
this.directories = this.rootDirs
this.subdirs = []
this.selectedPath = ''
this.selectedFullPath = ''
}
},
mounted() {

View File

@@ -63,7 +63,7 @@ export default {
},
audioMetatags: {
id: 'audioMetatags',
name: 'Audio file meta tags',
name: 'Audio file meta tags OR ebook metadata',
include: true
},
nfoFile: {

View File

@@ -68,7 +68,9 @@ export default {
selectAll: false,
search: null,
searchTimeout: null,
searchText: null
searchText: null,
downloadedEpisodeGuidMap: {},
downloadedEpisodeUrlMap: {}
}
},
watch: {
@@ -122,11 +124,13 @@ export default {
},
methods: {
getIsEpisodeDownloaded(episode) {
return this.itemEpisodes.some((downloadedEpisode) => {
if (episode.guid && downloadedEpisode.guid === episode.guid) return true
if (!downloadedEpisode.enclosure?.url) return false
return this.getCleanEpisodeUrl(downloadedEpisode.enclosure.url) === episode.cleanUrl
})
if (episode.guid && !!this.downloadedEpisodeGuidMap[episode.guid]) {
return true
}
if (this.downloadedEpisodeUrlMap[episode.cleanUrl]) {
return true
}
return false
},
/**
* UPDATE: As of v2.4.5 guid is used for matching existing downloaded episodes if it is found on the RSS feed.
@@ -219,6 +223,14 @@ export default {
})
},
init() {
this.downloadedEpisodeGuidMap = {}
this.downloadedEpisodeUrlMap = {}
this.itemEpisodes.forEach((episode) => {
if (episode.guid) this.downloadedEpisodeGuidMap[episode.guid] = episode.id
if (episode.enclosure?.url) this.downloadedEpisodeUrlMap[this.getCleanEpisodeUrl(episode.enclosure.url)] = episode.id
})
this.episodesCleaned = this.episodes
.filter((ep) => ep.enclosure?.url)
.map((_ep) => {

View File

@@ -1,7 +1,7 @@
<template>
<div class="w-full h-full">
<div v-show="showPageMenu" v-click-outside="clickOutside" class="pagemenu absolute top-9 left-8 rounded-md overflow-y-auto bg-bg shadow-lg z-20 border border-gray-400" :style="{ width: pageMenuWidth + 'px' }">
<div v-for="(file, index) in cleanedPageNames" :key="file" class="w-full cursor-pointer hover:bg-black-200 px-2 py-1" :class="page === index ? 'bg-black-200' : ''" @click="setPage(index + 1)">
<div v-for="(file, index) in cleanedPageNames" :key="file" class="w-full cursor-pointer hover:bg-black-200 px-2 py-1" :class="page === index + 1 ? 'bg-black-200' : ''" @click="setPage(index + 1)">
<p class="text-sm truncate">{{ file }}</p>
</div>
</div>

View File

@@ -12,7 +12,7 @@
</div>
</div>
<transition name="slide">
<div class="w-full" v-show="showFiles">
<div class="w-full" v-if="showFiles">
<table class="text-sm tracksTable">
<tr>
<th class="text-left px-4">{{ $strings.LabelPath }}</th>
@@ -70,7 +70,7 @@ export default {
},
audioFiles() {
if (this.libraryItem.mediaType === 'podcast') {
return this.libraryItem.media?.episodes.map((ep) => ep.audioFile) || []
return this.libraryItem.media?.episodes.map((ep) => ep.audioFile).filter((af) => af) || []
}
return this.libraryItem.media?.audioFiles || []
},

View File

@@ -1,18 +1,22 @@
<template>
<div class="w-full px-2 py-3 overflow-hidden relative border-b border-white border-opacity-10" @mouseover="mouseover" @mouseleave="mouseleave">
<div v-if="episode" class="flex items-center cursor-pointer" :class="{ 'opacity-70': isSelected || selectionMode }" @click="clickedEpisode">
<div class="flex-grow px-2">
<div :id="`lazy-episode-${index}`" class="w-full h-full cursor-pointer" @mouseover="mouseover" @mouseleave="mouseleave">
<div class="flex" @click="clickedEpisode">
<div class="flex-grow">
<div class="flex items-center">
<span class="text-sm font-semibold">{{ title }}</span>
<widgets-podcast-type-indicator :type="episode.episodeType" />
<span class="text-sm font-semibold">{{ episodeTitle }}</span>
<widgets-podcast-type-indicator :type="episodeType" />
</div>
<p class="text-sm text-gray-200 episode-subtitle mt-1.5 mb-0.5" v-html="subtitle"></p>
<div class="flex justify-between pt-2 max-w-xl">
<p v-if="episode.season" class="text-sm text-gray-300">Season #{{ episode.season }}</p>
<p v-if="episode.episode" class="text-sm text-gray-300">Episode #{{ episode.episode }}</p>
<p v-if="episode.chapters?.length" class="text-sm text-gray-300">{{ episode.chapters.length }} Chapters</p>
<p v-if="publishedAt" class="text-sm text-gray-300">Published {{ $formatDate(publishedAt, dateFormat) }}</p>
<div class="h-10 flex items-center mt-1.5 mb-0.5">
<p class="text-sm text-gray-200 episode-subtitle" v-html="episodeSubtitle"></p>
</div>
<div class="h-8 flex items-center">
<div class="w-full inline-flex justify-between max-w-xl">
<p v-if="episode?.season" class="text-sm text-gray-300">Season #{{ episode.season }}</p>
<p v-if="episode?.episode" class="text-sm text-gray-300">Episode #{{ episode.episode }}</p>
<p v-if="episode?.chapters?.length" class="text-sm text-gray-300">{{ episode.chapters.length }} Chapters</p>
<p v-if="publishedAt" class="text-sm text-gray-300">Published {{ $formatDate(publishedAt, dateFormat) }}</p>
</div>
</div>
<div class="flex items-center pt-2">
@@ -37,10 +41,11 @@
<ui-icon-btn v-if="userCanDelete" icon="close" borderless @click="removeClick" />
</div>
</div>
<div v-if="isHovering || isSelected || selectionMode" class="hidden md:block w-12 min-w-12" />
<div v-if="isHovering || isSelected || isSelectionMode" class="hidden md:block w-12 min-w-12" />
</div>
<div v-if="isSelected || selectionMode" class="absolute top-0 left-0 w-full h-full bg-black bg-opacity-10 z-10 cursor-pointer" @click.stop="clickedSelectionBg" />
<div class="hidden md:block md:w-12 md:min-w-12 md:-right-0 md:absolute md:top-0 h-full transform transition-transform z-20" :class="!isHovering && !isSelected && !selectionMode ? 'translate-x-24' : 'translate-x-0'">
<div v-if="isSelected || isSelectionMode" class="absolute top-0 left-0 w-full h-full bg-black bg-opacity-10 z-10 cursor-pointer" @click.stop="clickedSelectionBg" />
<div class="hidden md:block md:w-12 md:min-w-12 md:-right-0 md:absolute md:top-0 h-full transform transition-transform z-20" :class="!isHovering && !isSelected && !isSelectionMode ? 'translate-x-24' : 'translate-x-0'">
<div class="flex h-full items-center">
<div class="mx-1">
<ui-checkbox v-model="isSelected" @input="selectedUpdated" checkbox-bg="bg" />
@@ -55,84 +60,91 @@
<script>
export default {
props: {
index: Number,
libraryItemId: String,
episode: {
type: Object,
default: () => {}
},
selectionMode: Boolean
default: () => null
}
},
data() {
return {
isProcessingReadUpdate: false,
processingRemove: false,
isHovering: false,
isSelected: false
isSelected: false,
isSelectionMode: false
}
},
computed: {
store() {
return this.$store || this.$nuxt.$store
},
axios() {
return this.$axios || this.$nuxt.$axios
},
userCanUpdate() {
return this.$store.getters['user/getUserCanUpdate']
return this.store.getters['user/getUserCanUpdate']
},
userCanDelete() {
return this.$store.getters['user/getUserCanDelete']
return this.store.getters['user/getUserCanDelete']
},
audioFile() {
return this.episode.audioFile
episodeId() {
return this.episode?.id || ''
},
title() {
return this.episode.title || ''
episodeTitle() {
return this.episode?.title || ''
},
subtitle() {
return this.episode.subtitle || this.description
episodeSubtitle() {
return this.episode?.subtitle || ''
},
description() {
return this.episode.description || ''
episodeType() {
return this.episode?.episodeType || ''
},
duration() {
return this.$secondsToTimestamp(this.episode.duration)
publishedAt() {
return this.episode?.publishedAt
},
libraryItemIdStreaming() {
return this.$store.getters['getLibraryItemIdStreaming']
},
isStreamingFromDifferentLibrary() {
return this.$store.getters['getIsStreamingFromDifferentLibrary']
},
isStreaming() {
return this.$store.getters['getIsMediaStreaming'](this.libraryItemId, this.episode.id)
},
isQueued() {
return this.$store.getters['getIsMediaQueued'](this.libraryItemId, this.episode.id)
},
streamIsPlaying() {
return this.$store.state.streamIsPlaying && this.isStreaming
dateFormat() {
return this.store.state.serverSettings.dateFormat
},
itemProgress() {
return this.$store.getters['user/getUserMediaProgress'](this.libraryItemId, this.episode.id)
return this.store.getters['user/getUserMediaProgress'](this.libraryItemId, this.episodeId)
},
itemProgressPercent() {
return this.itemProgress ? this.itemProgress.progress : 0
return this.itemProgress?.progress || 0
},
userIsFinished() {
return this.itemProgress ? !!this.itemProgress.isFinished : false
return !!this.itemProgress?.isFinished
},
libraryItemIdStreaming() {
return this.store.getters['getLibraryItemIdStreaming']
},
isStreamingFromDifferentLibrary() {
return this.store.getters['getIsStreamingFromDifferentLibrary']
},
isStreaming() {
return this.store.getters['getIsMediaStreaming'](this.libraryItemId, this.episodeId)
},
isQueued() {
return this.store.getters['getIsMediaQueued'](this.libraryItemId, this.episodeId)
},
streamIsPlaying() {
return this.store.state.streamIsPlaying && this.isStreaming
},
timeRemaining() {
if (this.streamIsPlaying) return 'Playing'
if (!this.itemProgress) return this.$elapsedPretty(this.episode.duration)
if (!this.itemProgress) return this.$elapsedPretty(this.episode?.duration || 0)
if (this.userIsFinished) return 'Finished'
var remaining = Math.floor(this.itemProgress.duration - this.itemProgress.currentTime)
const duration = this.itemProgress.duration || this.episode?.duration || 0
const remaining = Math.floor(duration - this.itemProgress.currentTime)
return `${this.$elapsedPretty(remaining)} left`
},
publishedAt() {
return this.episode.publishedAt
},
dateFormat() {
return this.$store.state.serverSettings.dateFormat
}
},
methods: {
clickAddToPlaylist() {
this.$emit('addToPlaylist', this.episode)
setSelectionMode(isSelectionMode) {
this.isSelectionMode = isSelectionMode
if (!this.isSelectionMode) this.isSelected = false
},
clickedEpisode() {
this.$emit('view', this.episode)
@@ -150,16 +162,23 @@ export default {
mouseleave() {
this.isHovering = false
},
clickEdit() {
this.$emit('edit', this.episode)
},
playClick() {
if (this.streamIsPlaying) {
this.$eventBus.$emit('pause-item')
const eventBus = this.$eventBus || this.$nuxt.$eventBus
eventBus.$emit('pause-item')
} else {
this.$emit('play', this.episode)
}
},
queueBtnClick() {
if (this.isQueued) {
// Remove from queue
this.store.commit('removeItemFromQueue', { libraryItemId: this.libraryItemId, episodeId: this.episodeId })
} else {
// Add to queue
this.$emit('addToQueue', this.episode)
}
},
toggleFinished(confirmed = false) {
if (!this.userIsFinished && this.itemProgressPercent > 0 && !confirmed) {
const payload = {
@@ -171,37 +190,47 @@ export default {
},
type: 'yesNo'
}
this.$store.commit('globals/setConfirmPrompt', payload)
this.store.commit('globals/setConfirmPrompt', payload)
return
}
var updatePayload = {
const updatePayload = {
isFinished: !this.userIsFinished
}
this.isProcessingReadUpdate = true
this.$axios
.$patch(`/api/me/progress/${this.libraryItemId}/${this.episode.id}`, updatePayload)
this.axios
.$patch(`/api/me/progress/${this.libraryItemId}/${this.episodeId}`, updatePayload)
.then(() => {
this.isProcessingReadUpdate = false
})
.catch((error) => {
console.error('Failed', error)
this.isProcessingReadUpdate = false
this.$toast.error(updatePayload.isFinished ? this.$strings.ToastItemMarkedAsFinishedFailed : this.$strings.ToastItemMarkedAsNotFinishedFailed)
const toast = this.$toast || this.$nuxt.$toast
toast.error(updatePayload.isFinished ? this.$strings.ToastItemMarkedAsFinishedFailed : this.$strings.ToastItemMarkedAsNotFinishedFailed)
})
},
clickAddToPlaylist() {
this.$emit('addToPlaylist', this.episode)
},
clickEdit() {
this.$emit('edit', this.episode)
},
removeClick() {
this.$emit('remove', this.episode)
},
queueBtnClick() {
if (this.isQueued) {
// Remove from queue
this.$store.commit('removeItemFromQueue', { libraryItemId: this.libraryItemId, episodeId: this.episode.id })
} else {
// Add to queue
this.$emit('addToQueue', this.episode)
destroy() {
// destroy the vue listeners, etc
this.$destroy()
// remove the element from the DOM
if (this.$el && this.$el.parentNode) {
this.$el.parentNode.removeChild(this.$el)
} else if (this.$el && this.$el.remove) {
this.$el.remove()
}
}
}
},
mounted() {}
}
</script>
</script>

View File

@@ -1,5 +1,5 @@
<template>
<div class="w-full py-6">
<div id="lazy-episodes-table" class="w-full py-6">
<div class="flex flex-wrap flex-col md:flex-row md:items-center mb-4">
<div class="flex items-center flex-nowrap whitespace-nowrap mb-2 md:mb-0">
<p class="text-lg mb-0 font-semibold">{{ $strings.HeaderEpisodes }}</p>
@@ -18,28 +18,41 @@
<ui-btn :disabled="processing" small class="ml-2 h-9" @click="clearSelected">{{ $strings.ButtonCancel }}</ui-btn>
</template>
<template v-else>
<controls-filter-select v-model="filterKey" :items="filterItems" class="w-36 h-9 md:ml-4" />
<controls-sort-select v-model="sortKey" :descending.sync="sortDesc" :items="sortItems" class="w-44 md:w-48 h-9 ml-1 sm:ml-4" />
<controls-filter-select v-model="filterKey" :items="filterItems" class="w-36 h-9 md:ml-4" @change="filterSortChanged" />
<controls-sort-select v-model="sortKey" :descending.sync="sortDesc" :items="sortItems" class="w-44 md:w-48 h-9 ml-1 sm:ml-4" @change="filterSortChanged" />
<div class="flex-grow md:hidden" />
<ui-context-menu-dropdown v-if="contextMenuItems.length" :items="contextMenuItems" class="ml-1" @action="contextMenuAction" />
</template>
</div>
</div>
<p v-if="!episodes.length" class="py-4 text-center text-lg">{{ $strings.MessageNoEpisodes }}</p>
<!-- <p v-if="!episodes.length" class="py-4 text-center text-lg">{{ $strings.MessageNoEpisodes }}</p> -->
<div v-if="episodes.length" class="w-full py-3 mx-auto flex">
<form @submit.prevent="submit" class="flex flex-grow">
<ui-text-input v-model="search" @input="inputUpdate" type="search" :placeholder="$strings.PlaceholderSearchEpisode" class="flex-grow mr-2 text-sm md:text-base" />
</form>
</div>
<template v-for="episode in episodesList">
<tables-podcast-episode-table-row ref="episodeRow" :key="episode.id" :episode="episode" :library-item-id="libraryItem.id" :selection-mode="isSelectionMode" class="item" @play="playEpisode" @remove="removeEpisode" @edit="editEpisode" @view="viewEpisode" @selected="episodeSelected" @addToQueue="addEpisodeToQueue" @addToPlaylist="addToPlaylist" />
</template>
<div class="relative min-h-[176px]">
<template v-for="episode in totalEpisodes">
<div :key="episode" :id="`episode-${episode - 1}`" class="w-full h-44 px-2 py-3 overflow-hidden relative border-b border-white/10">
<!-- episode is mounted here -->
</div>
</template>
<div v-if="isSearching" class="w-full h-full absolute inset-0 flex justify-center py-12" :class="{ 'bg-black/50': totalEpisodes }">
<ui-loading-indicator />
</div>
<div v-else-if="!totalEpisodes" class="h-44 flex items-center justify-center">
<p class="text-lg">{{ $strings.MessageNoEpisodes }}</p>
</div>
</div>
<modals-podcast-remove-episode v-model="showPodcastRemoveModal" @input="removeEpisodeModalToggled" :library-item="libraryItem" :episodes="episodesToRemove" @clearSelected="clearSelected" />
</div>
</template>
<script>
import Vue from 'vue'
import LazyEpisodeRow from './LazyEpisodeRow.vue'
export default {
props: {
libraryItem: {
@@ -60,13 +73,21 @@ export default {
processing: false,
search: null,
searchTimeout: null,
searchText: null
searchText: null,
isSearching: false,
totalEpisodes: 0,
episodesPerPage: null,
episodeIndexesMounted: [],
episodeComponentRefs: {},
windowHeight: 0,
episodesTableOffsetTop: 0,
episodeRowHeight: 176
}
},
watch: {
libraryItem: {
handler() {
this.init()
this.refresh()
}
}
},
@@ -194,13 +215,19 @@ export default {
submit() {},
inputUpdate() {
clearTimeout(this.searchTimeout)
this.isSearching = true
let searchStart = this.searchText
this.searchTimeout = setTimeout(() => {
if (!this.search || !this.search.trim()) {
this.isSearching = false
if (!this.search?.trim()) {
this.searchText = ''
return
} else {
this.searchText = this.search.toLowerCase().trim()
}
this.searchText = this.search.toLowerCase().trim()
}, 500)
if (searchStart !== this.searchText) {
this.init()
}
}, 750)
},
contextMenuAction({ action }) {
if (action === 'quick-match-episodes') {
@@ -304,24 +331,30 @@ export default {
if (!val) this.episodesToRemove = []
},
clearSelected() {
const episodeRows = this.$refs.episodeRow
if (episodeRows && episodeRows.length) {
for (const epRow of episodeRows) {
if (epRow) epRow.isSelected = false
}
}
this.selectedEpisodes = []
this.setSelectionModeForEpisodes()
},
removeSelectedEpisodes() {
this.episodesToRemove = this.selectedEpisodes
this.showPodcastRemoveModal = true
},
episodeSelected({ isSelected, episode }) {
let isSelectionModeBefore = this.isSelectionMode
if (isSelected) {
this.selectedEpisodes.push(episode)
} else {
this.selectedEpisodes = this.selectedEpisodes.filter((ep) => ep.id !== episode.id)
}
if (this.isSelectionMode !== isSelectionModeBefore) {
this.setSelectionModeForEpisodes()
}
},
setSelectionModeForEpisodes() {
for (const key in this.episodeComponentRefs) {
if (this.episodeComponentRefs[key]?.setSelectionMode) {
this.episodeComponentRefs[key].setSelectionMode(this.isSelectionMode)
}
}
},
playEpisode(episode) {
const queueItems = []
@@ -367,12 +400,147 @@ export default {
this.$store.commit('globals/setSelectedEpisode', episode)
this.$store.commit('globals/setShowViewPodcastEpisodeModal', true)
},
init() {
destroyEpisodeComponents() {
for (const key in this.episodeComponentRefs) {
if (this.episodeComponentRefs[key]?.destroy) {
this.episodeComponentRefs[key].destroy()
}
}
this.episodeComponentRefs = {}
this.episodeIndexesMounted = []
},
mountEpisode(index) {
const episodeEl = document.getElementById(`episode-${index}`)
if (!episodeEl) {
console.warn('Episode row el not found at ' + index)
return
}
this.episodeIndexesMounted.push(index)
if (this.episodeComponentRefs[index]) {
const episodeComponent = this.episodeComponentRefs[index]
episodeEl.appendChild(episodeComponent.$el)
if (this.isSelectionMode) {
episodeComponent.setSelectionMode(true)
if (this.selectedEpisodes.some((i) => i.id === episodeComponent.episodeId)) {
episodeComponent.isSelected = true
} else {
episodeComponent.isSelected = false
}
} else {
episodeComponent.setSelectionMode(false)
}
} else {
const _this = this
const ComponentClass = Vue.extend(LazyEpisodeRow)
const instance = new ComponentClass({
propsData: {
index,
libraryItemId: this.libraryItem.id,
episode: this.episodesList[index]
},
created() {
this.$on('selected', (payload) => {
_this.episodeSelected(payload)
})
this.$on('view', (payload) => {
_this.viewEpisode(payload)
})
this.$on('play', (payload) => {
_this.playEpisode(payload)
})
this.$on('addToQueue', (payload) => {
_this.addEpisodeToQueue(payload)
})
this.$on('remove', (payload) => {
_this.removeEpisode(payload)
})
this.$on('edit', (payload) => {
_this.editEpisode(payload)
})
this.$on('addToPlaylist', (payload) => {
_this.addToPlaylist(payload)
})
}
})
this.episodeComponentRefs[index] = instance
instance.$mount()
episodeEl.appendChild(instance.$el)
if (this.isSelectionMode) {
instance.setSelectionMode(true)
if (this.selectedEpisodes.some((i) => i.id === this.episodesList[index].id)) {
instance.isSelected = true
}
}
}
},
mountEpisodes(startIndex, endIndex) {
for (let i = startIndex; i < endIndex; i++) {
if (!this.episodeIndexesMounted.includes(i)) {
this.mountEpisode(i)
}
}
},
scroll(evt) {
if (!evt?.target?.scrollTop) return
const scrollTop = Math.max(evt.target.scrollTop - this.episodesTableOffsetTop, 0)
let firstEpisodeIndex = Math.floor(scrollTop / this.episodeRowHeight)
let lastEpisodeIndex = Math.ceil((scrollTop + this.windowHeight) / this.episodeRowHeight)
lastEpisodeIndex = Math.min(this.totalEpisodes - 1, lastEpisodeIndex)
this.episodeIndexesMounted = this.episodeIndexesMounted.filter((_index) => {
if (_index < firstEpisodeIndex || _index >= lastEpisodeIndex) {
const el = document.getElementById(`lazy-episode-${_index}`)
if (el) el.remove()
return false
}
return true
})
this.mountEpisodes(firstEpisodeIndex, lastEpisodeIndex + 1)
},
initListeners() {
const itemPageWrapper = document.getElementById('item-page-wrapper')
if (itemPageWrapper) {
itemPageWrapper.addEventListener('scroll', this.scroll)
}
},
removeListeners() {
const itemPageWrapper = document.getElementById('item-page-wrapper')
if (itemPageWrapper) {
itemPageWrapper.removeEventListener('scroll', this.scroll)
}
},
filterSortChanged() {
this.init()
},
refresh() {
this.episodesCopy = this.episodes.map((ep) => ({ ...ep }))
this.init()
},
init() {
this.destroyEpisodeComponents()
this.totalEpisodes = this.episodesList.length
const lazyEpisodesTableEl = document.getElementById('lazy-episodes-table')
this.episodesTableOffsetTop = (lazyEpisodesTableEl?.offsetTop || 0) + 64
this.windowHeight = window.innerHeight
this.episodesPerPage = Math.ceil(this.windowHeight / this.episodeRowHeight)
this.$nextTick(() => {
this.mountEpisodes(0, Math.min(this.episodesPerPage, this.totalEpisodes))
})
}
},
mounted() {
this.episodesCopy = this.episodes.map((ep) => ({ ...ep }))
this.initListeners()
this.init()
},
beforeDestroy() {
this.removeListeners()
}
}
</script>

View File

@@ -15,6 +15,13 @@ export default {
type: String,
default: 'right'
},
/**
* Delay showing the tooltip after X milliseconds of hovering
*/
delayOnShow: {
type: Number,
default: 0
},
disabled: Boolean
},
data() {
@@ -22,7 +29,8 @@ export default {
tooltip: null,
tooltipId: null,
isShowing: false,
hideTimeout: null
hideTimeout: null,
delayOnShowTimeout: null
}
},
watch: {
@@ -59,29 +67,44 @@ export default {
this.tooltip = tooltip
},
setTooltipPosition(tooltip) {
var boxChow = this.$refs.box.getBoundingClientRect()
const boxRect = this.$refs.box.getBoundingClientRect()
const shouldMount = !tooltip.isConnected
var shouldMount = !tooltip.isConnected
// Calculate size of tooltip
if (shouldMount) document.body.appendChild(tooltip)
var { width, height } = tooltip.getBoundingClientRect()
const tooltipRect = tooltip.getBoundingClientRect()
if (shouldMount) tooltip.remove()
var top = 0
var left = 0
// Subtracting scrollbar size
const windowHeight = window.innerHeight - 8
const windowWidth = window.innerWidth - 8
let top = 0
let left = 0
if (this.direction === 'right') {
top = boxChow.top - height / 2 + boxChow.height / 2
left = boxChow.left + boxChow.width + 4
top = Math.max(0, boxRect.top - tooltipRect.height / 2 + boxRect.height / 2)
left = Math.max(0, boxRect.left + boxRect.width + 4)
} else if (this.direction === 'bottom') {
top = boxChow.top + boxChow.height + 4
left = boxChow.left - width / 2 + boxChow.width / 2
top = Math.max(0, boxRect.top + boxRect.height + 4)
left = Math.max(0, boxRect.left - tooltipRect.width / 2 + boxRect.width / 2)
} else if (this.direction === 'top') {
top = boxChow.top - height - 4
left = boxChow.left - width / 2 + boxChow.width / 2
top = Math.max(0, boxRect.top - tooltipRect.height - 4)
left = Math.max(0, boxRect.left - tooltipRect.width / 2 + boxRect.width / 2)
} else if (this.direction === 'left') {
top = boxChow.top - height / 2 + boxChow.height / 2
left = boxChow.left - width - 4
top = Math.max(0, boxRect.top - tooltipRect.height / 2 + boxRect.height / 2)
left = Math.max(0, boxRect.left - tooltipRect.width - 4)
}
// Shift left if tooltip would overflow the window on the right
if (left + tooltipRect.width > windowWidth) {
left -= left + tooltipRect.width - windowWidth
}
// Shift up if tooltip would overflow the window on the bottom
if (top + tooltipRect.height > windowHeight) {
top -= top + tooltipRect.height - windowHeight
}
tooltip.style.top = top + 'px'
tooltip.style.left = left + 'px'
},
@@ -107,15 +130,33 @@ export default {
this.isShowing = false
},
cancelHide() {
if (this.hideTimeout) clearTimeout(this.hideTimeout)
clearTimeout(this.hideTimeout)
},
mouseover() {
if (!this.isShowing) this.showTooltip()
if (this.isShowing || this.disabled) return
if (this.delayOnShow) {
if (this.delayOnShowTimeout) {
// Delay already running
return
}
this.delayOnShowTimeout = setTimeout(() => {
this.showTooltip()
this.delayOnShowTimeout = null
}, this.delayOnShow)
} else {
this.showTooltip()
}
},
mouseleave() {
if (this.isShowing) {
this.hideTimeout = setTimeout(this.hideTooltip, 100)
if (!this.isShowing) {
clearTimeout(this.delayOnShowTimeout)
this.delayOnShowTimeout = null
return
}
this.hideTimeout = setTimeout(this.hideTooltip, 100)
}
},
beforeDestroy() {

View File

@@ -1,12 +1,12 @@
{
"name": "audiobookshelf-client",
"version": "2.7.0",
"version": "2.7.2",
"lockfileVersion": 2,
"requires": true,
"packages": {
"": {
"name": "audiobookshelf-client",
"version": "2.7.0",
"version": "2.7.2",
"license": "ISC",
"dependencies": {
"@nuxtjs/axios": "^5.13.6",

View File

@@ -1,6 +1,6 @@
{
"name": "audiobookshelf-client",
"version": "2.7.0",
"version": "2.7.2",
"buildNumber": 1,
"description": "Self-hosted audiobook and podcast client",
"main": "index.js",

View File

@@ -178,9 +178,9 @@
</a>
<p class="pl-4 pr-2 text-sm text-yellow-400">
{{ $strings.MessageJoinUsOn }}
<a class="underline" href="https://discord.gg/pJsjuNCKRq" target="_blank">discord</a>
<a class="underline" href="https://discord.gg/HQgCbd6E75" target="_blank">discord</a>
</p>
<a href="https://discord.gg/pJsjuNCKRq" target="_blank" class="text-white hover:text-gray-200 hover:scale-150 hover:rotate-6 transform duration-500">
<a href="https://discord.gg/HQgCbd6E75" target="_blank" class="text-white hover:text-gray-200 hover:scale-150 hover:rotate-6 transform duration-500">
<svg width="31" height="24" viewBox="0 0 71 55" fill="none" xmlns="http://www.w3.org/2000/svg">
<g clip-path="url(#clip0)">
<path

View File

@@ -1,6 +1,6 @@
<template>
<div id="page-wrapper" class="bg-bg page overflow-hidden" :class="streamLibraryItem ? 'streaming' : ''">
<div class="w-full h-full overflow-y-auto px-2 py-6 lg:p-8">
<div id="item-page-wrapper" class="w-full h-full overflow-y-auto px-2 py-6 lg:p-8">
<div class="flex flex-col lg:flex-row max-w-6xl mx-auto">
<div class="w-full flex justify-center lg:block lg:w-52" style="min-width: 208px">
<div class="relative group" style="height: fit-content">
@@ -136,7 +136,7 @@
<widgets-audiobook-data v-if="tracks.length" :library-item-id="libraryItemId" :is-file="isFile" :media="media" />
<tables-podcast-episodes-table v-if="isPodcast" :library-item="libraryItem" />
<tables-podcast-lazy-episodes-table v-if="isPodcast" :library-item="libraryItem" />
<tables-chapters-table v-if="chapters.length" :library-item="libraryItem" class="mt-6" />

View File

@@ -54,9 +54,16 @@
<p class="pl-2 pr-1 text-sm font-semibold">{{ getButtonText(episode) }}</p>
</button>
<button v-if="libraryItemIdStreaming && !isStreamingFromDifferentLibrary" class="h-8 w-8 flex justify-center items-center mx-2" :class="playerQueueEpisodeIdMap[episode.id] ? 'text-success' : ''" @click.stop="queueBtnClick(episode)">
<span class="material-icons-outlined text-2xl">{{ playerQueueEpisodeIdMap[episode.id] ? 'playlist_add_check' : 'playlist_add' }}</span>
</button>
<ui-tooltip v-if="libraryItemIdStreaming && !isStreamingFromDifferentLibrary" :text="playerQueueEpisodeIdMap[episode.id] ? $strings.MessageRemoveFromPlayerQueue : $strings.MessageAddToPlayerQueue" :class="playerQueueEpisodeIdMap[episode.id] ? 'text-success' : ''" direction="top">
<ui-icon-btn :icon="playerQueueEpisodeIdMap[episode.id] ? 'playlist_add_check' : 'playlist_play'" borderless @click="queueBtnClick(episode)" />
<!-- <button class="h-8 w-8 flex justify-center items-center mx-2" :class="playerQueueEpisodeIdMap[episode.id] ? 'text-success' : ''" @click.stop="queueBtnClick(episode)">
<span class="material-icons-outlined text-2xl">{{ playerQueueEpisodeIdMap[episode.id] ? 'playlist_add_check' : 'playlist_add' }}</span>
</button> -->
</ui-tooltip>
<ui-tooltip :text="$strings.LabelYourPlaylists" direction="top">
<ui-icon-btn icon="playlist_add" borderless @click="clickAddToPlaylist(episode)" />
</ui-tooltip>
</div>
</div>
@@ -136,6 +143,15 @@ export default {
}
},
methods: {
clickAddToPlaylist(episode) {
// Makeshift libraryItem
const libraryItem = {
id: episode.libraryItemId,
media: episode.podcast
}
this.$store.commit('globals/setSelectedPlaylistItems', [{ libraryItem: libraryItem, episode }])
this.$store.commit('globals/setShowPlaylistsModal', true)
},
async clickEpisode(episode) {
if (this.openingItem) return
this.openingItem = true
@@ -155,7 +171,9 @@ export default {
if (this.episodeIdStreaming === episode.id) return this.streamIsPlaying ? 'Streaming' : 'Play'
if (!episode.progress) return this.$elapsedPretty(episode.duration)
if (episode.progress.isFinished) return 'Finished'
var remaining = Math.floor(episode.progress.duration - episode.progress.currentTime)
const duration = episode.progress.duration || episode.duration
const remaining = Math.floor(duration - episode.progress.currentTime)
return `${this.$elapsedPretty(remaining)} left`
},
playClick(episodeToPlay) {

View File

@@ -80,13 +80,11 @@ export const actions = {
return state.folders
}
}
console.log('Loading folders')
commit('setFoldersLastUpdate')
return this.$axios
.$get('/api/filesystem')
.then((res) => {
console.log('Settings folders', res)
commit('setFolders', res.directories)
return res.directories
})
@@ -119,15 +117,16 @@ export const actions = {
dispatch('user/checkUpdateLibrarySortFilter', library.mediaType, { root: true })
if (libraryChanging) {
commit('setCollections', [])
commit('setUserPlaylists', [])
}
commit('addUpdate', library)
commit('setLibraryIssues', issues)
commit('setLibraryFilterData', filterData)
commit('setNumUserPlaylists', numUserPlaylists)
commit('setCurrentLibrary', libraryId)
if (libraryChanging) {
commit('setCollections', [])
commit('setUserPlaylists', [])
}
return data
})
.catch((error) => {

View File

@@ -668,7 +668,6 @@
"NoteChangeRootPassword": "Uživatel root je jediný uživatel, který může mít prázdné heslo",
"NoteChapterEditorTimes": "Poznámka: Čas začátku první kapitoly musí zůstat v 0:00 a čas začátku poslední kapitoly nesmí překročit tuto dobu trvání audioknihy.",
"NoteFolderPicker": "Poznámka: složky, které jsou již namapovány, nebudou zobrazeny",
"NoteFolderPickerDebian": "Poznámka: Výběr složek pro instalaci debianu není plně implementován. Cestu ke své knihovně byste měli zadat přímo.",
"NoteRSSFeedPodcastAppsHttps": "Upozornění: Většina aplikací pro podcasty bude vyžadovat, aby adresa URL kanálu RSS používala protokol HTTPS",
"NoteRSSFeedPodcastAppsPubDate": "Upozornění: 1 nebo více epizod nemá datum vydání. Některé podcastové aplikace to vyžadují.",
"NoteUploaderFoldersWithMediaFiles": "Se složkami s multimediálními soubory bude zacházeno jako se samostatnými položkami knihovny.",

View File

@@ -668,7 +668,6 @@
"NoteChangeRootPassword": "Root-brugeren er den eneste bruger, der kan have en tom adgangskode",
"NoteChapterEditorTimes": "Bemærk: Første kapitel starttidspunkt skal forblive kl. 0:00, og det sidste kapitel starttidspunkt må ikke overstige denne lydbogs varighed.",
"NoteFolderPicker": "Bemærk: Mapper, der allerede er mappet, vises ikke",
"NoteFolderPickerDebian": "Bemærk: Mappicker for Debian-installationen er ikke fuldt implementeret. Du bør indtaste stien til dit bibliotek direkte.",
"NoteRSSFeedPodcastAppsHttps": "Advarsel: De fleste podcast-apps kræver, at RSS-feedets URL bruger HTTPS",
"NoteRSSFeedPodcastAppsPubDate": "Advarsel: En eller flere af dine episoder har ikke en Pub Date. Nogle podcast-apps kræver dette.",
"NoteUploaderFoldersWithMediaFiles": "Mapper med mediefiler håndteres som separate bibliotekselementer.",

View File

@@ -668,7 +668,6 @@
"NoteChangeRootPassword": "Der Root-Benutzer (Hauptbenutzer) ist der einzige Benutzer, der ein leeres Passwort haben kann",
"NoteChapterEditorTimes": "Hinweis: Die Anfangszeit des ersten Kapitels muss bei 0:00 beginnen und die Anfangszeit des letzten Kapitels darf die Dauer des Mediums nicht überschreiten.",
"NoteFolderPicker": "Hinweis: Bereits zugeordnete Ordner werden nicht angezeigt.",
"NoteFolderPickerDebian": "Hinweis: Der Ordnerauswahldialog für die Debian-Installation ist nicht vollständig implementiert. Sie sollten den Pfad zu Ihrer Bibliothek direkt eingeben.",
"NoteRSSFeedPodcastAppsHttps": "Warnung: Die meisten Podcast-Apps verlangen, dass die URL des RSS-Feeds HTTPS verwendet.",
"NoteRSSFeedPodcastAppsPubDate": "Warnung: 1 oder mehrere Ihrer Episoden haben kein Veröffentlichungsdatum. Einige Podcast-Apps verlangen dies.",
"NoteUploaderFoldersWithMediaFiles": "Ordner mit Mediendateien werden als separate Bibliothekselemente behandelt.",
@@ -750,4 +749,4 @@
"ToastSocketFailedToConnect": "Verbindung zum WebSocket fehlgeschlagen",
"ToastUserDeleteFailed": "Benutzer konnte nicht gelöscht werden",
"ToastUserDeleteSuccess": "Benutzer gelöscht"
}
}

View File

@@ -668,7 +668,6 @@
"NoteChangeRootPassword": "Root user is the only user that can have an empty password",
"NoteChapterEditorTimes": "Note: First chapter start time must remain at 0:00 and the last chapter start time cannot exceed this audiobooks duration.",
"NoteFolderPicker": "Note: folders already mapped will not be shown",
"NoteFolderPickerDebian": "Note: Folder picker for the debian install is not fully implemented. You should enter the path to your library directly.",
"NoteRSSFeedPodcastAppsHttps": "Warning: Most podcast apps will require the RSS feed URL is using HTTPS",
"NoteRSSFeedPodcastAppsPubDate": "Warning: 1 or more of your episodes do not have a Pub Date. Some podcast apps require this.",
"NoteUploaderFoldersWithMediaFiles": "Folders with media files will be handled as separate library items.",

View File

@@ -668,7 +668,6 @@
"NoteChangeRootPassword": "El usuario Root es el único usuario que puede no tener una contraseña",
"NoteChapterEditorTimes": "Nota: El tiempo de inicio del primer capítulo debe permanecer en 0:00, y el tiempo de inicio del último capítulo no puede exceder la duración del audiolibro.",
"NoteFolderPicker": "Nota: Las carpetas ya asignadas no se mostrarán",
"NoteFolderPickerDebian": "Nota: El selector de archivos no está completamente implementado para instalaciones en Debian. Deberá ingresar la ruta de la carpeta de su biblioteca directamente.",
"NoteRSSFeedPodcastAppsHttps": "Advertencia: La mayoría de las aplicaciones de podcast requieren que la URL de la fuente RSS use HTTPS",
"NoteRSSFeedPodcastAppsPubDate": "Advertencia: 1 o más de sus episodios no tienen fecha de publicación. Algunas aplicaciones de podcast lo requieren.",
"NoteUploaderFoldersWithMediaFiles": "Las carpetas con archivos multimedia se manejarán como elementos separados en la biblioteca.",

View File

@@ -1,10 +1,10 @@
{
"ButtonAdd": "Ajouter",
"ButtonAddChapters": "Ajouter le chapitre",
"ButtonAddDevice": "Add Device",
"ButtonAddLibrary": "Add Library",
"ButtonAddDevice": "Ajouter un appareil",
"ButtonAddLibrary": "Ajouter une bibliothèque",
"ButtonAddPodcasts": "Ajouter des podcasts",
"ButtonAddUser": "Add User",
"ButtonAddUser": "Ajouter un utilisateur",
"ButtonAddYourFirstLibrary": "Ajouter votre première bibliothèque",
"ButtonApply": "Appliquer",
"ButtonApplyChapters": "Appliquer les chapitres",
@@ -62,7 +62,7 @@
"ButtonRemoveSeriesFromContinueSeries": "Ne plus continuer à écouter la série",
"ButtonReScan": "Nouvelle analyse",
"ButtonReset": "Réinitialiser",
"ButtonResetToDefault": "Reset to default",
"ButtonResetToDefault": "Réinitialiser aux valeurs par défaut",
"ButtonRestore": "Rétablir",
"ButtonSave": "Sauvegarder",
"ButtonSaveAndClose": "Sauvegarder et Fermer",
@@ -87,9 +87,9 @@
"ButtonUserEdit": "Modifier lutilisateur {0}",
"ButtonViewAll": "Afficher tout",
"ButtonYes": "Oui",
"ErrorUploadFetchMetadataAPI": "Error fetching metadata",
"ErrorUploadFetchMetadataNoResults": "Could not fetch metadata - try updating title and/or author",
"ErrorUploadLacksTitle": "Must have a title",
"ErrorUploadFetchMetadataAPI": "Erreur lors de la récupération des métadonnées",
"ErrorUploadFetchMetadataNoResults": "Impossible de récupérer les métadonnées - essayez de mettre à jour le titre et/ou lauteur.",
"ErrorUploadLacksTitle": "Doit avoir un titre",
"HeaderAccount": "Compte",
"HeaderAdvanced": "Avancé",
"HeaderAppriseNotificationSettings": "Configuration des Notifications Apprise",
@@ -101,7 +101,7 @@
"HeaderChapters": "Chapitres",
"HeaderChooseAFolder": "Choisir un dossier",
"HeaderCollection": "Collection",
"HeaderCollectionItems": "Entrées de la Collection",
"HeaderCollectionItems": "Entrées de la collection",
"HeaderCover": "Couverture",
"HeaderCurrentDownloads": "Téléchargements en cours",
"HeaderDetails": "Détails",
@@ -114,10 +114,10 @@
"HeaderEreaderSettings": "Options Ereader",
"HeaderFiles": "Fichiers",
"HeaderFindChapters": "Trouver les chapitres",
"HeaderIgnoredFiles": "Fichiers Ignorés",
"HeaderItemFiles": "Fichiers des Articles",
"HeaderIgnoredFiles": "Fichiers ignorés",
"HeaderItemFiles": "Fichiers des articles",
"HeaderItemMetadataUtils": "Outils de gestion des métadonnées",
"HeaderLastListeningSession": "Dernière Session découte",
"HeaderLastListeningSession": "Dernière session découte",
"HeaderLatestEpisodes": "Dernier épisodes",
"HeaderLibraries": "Bibliothèque",
"HeaderLibraryFiles": "Fichier de bibliothèque",
@@ -130,15 +130,15 @@
"HeaderManageTags": "Gérer les étiquettes",
"HeaderMapDetails": "Édition en masse",
"HeaderMatch": "Chercher",
"HeaderMetadataOrderOfPrecedence": "Metadata order of precedence",
"HeaderMetadataToEmbed": "Métadonnée à intégrer",
"HeaderMetadataOrderOfPrecedence": "Ordre de priorité des métadonnées",
"HeaderMetadataToEmbed": "Métadonnées à intégrer",
"HeaderNewAccount": "Nouveau compte",
"HeaderNewLibrary": "Nouvelle bibliothèque",
"HeaderNotifications": "Notifications",
"HeaderOpenIDConnectAuthentication": "OpenID Connect Authentication",
"HeaderOpenRSSFeed": "Ouvrir Flux RSS",
"HeaderOpenIDConnectAuthentication": "Authentification via OpenID Connect",
"HeaderOpenRSSFeed": "Ouvrir un flux RSS",
"HeaderOtherFiles": "Autres fichiers",
"HeaderPasswordAuthentication": "Password Authentication",
"HeaderPasswordAuthentication": "Authentification par mot de passe",
"HeaderPermissions": "Permissions",
"HeaderPlayerQueue": "Liste découte",
"HeaderPlaylist": "Liste de lecture",
@@ -154,7 +154,7 @@
"HeaderSchedule": "Programmation",
"HeaderScheduleLibraryScans": "Analyse automatique de la bibliothèque",
"HeaderSession": "Session",
"HeaderSetBackupSchedule": "Activer la Sauvegarde Automatique",
"HeaderSetBackupSchedule": "Activer la sauvegarde automatique",
"HeaderSettings": "Paramètres",
"HeaderSettingsDisplay": "Affichage",
"HeaderSettingsExperimental": "Fonctionnalités expérimentales",
@@ -187,11 +187,11 @@
"LabelAddToCollectionBatch": "Ajout de {0} livres à la lollection",
"LabelAddToPlaylist": "Ajouter à la liste de lecture",
"LabelAddToPlaylistBatch": "{0} éléments ajoutés à la liste de lecture",
"LabelAdminUsersOnly": "Admin users only",
"LabelAdminUsersOnly": "Administrateurs uniquement",
"LabelAll": "Tout",
"LabelAllUsers": "Tous les utilisateurs",
"LabelAllUsersExcludingGuests": "All users excluding guests",
"LabelAllUsersIncludingGuests": "All users including guests",
"LabelAllUsersExcludingGuests": "Tous les utilisateurs à lexception des invités",
"LabelAllUsersIncludingGuests": "Tous les utilisateurs, y compris les invités",
"LabelAlreadyInYourLibrary": "Déjà dans la bibliothèque",
"LabelAppend": "Ajouter",
"LabelAuthor": "Auteur",
@@ -199,29 +199,29 @@
"LabelAuthorLastFirst": "Auteur (Nom, Prénom)",
"LabelAuthors": "Auteurs",
"LabelAutoDownloadEpisodes": "Téléchargement automatique dépisode",
"LabelAutoFetchMetadata": "Auto Fetch Metadata",
"LabelAutoFetchMetadataHelp": "Fetches metadata for title, author, and series to streamline uploading. Additional metadata may have to be matched after upload.",
"LabelAutoLaunch": "Auto Launch",
"LabelAutoLaunchDescription": "Redirect to the auth provider automatically when navigating to the login page (manual override path <code>/login?autoLaunch=0</code>)",
"LabelAutoRegister": "Auto Register",
"LabelAutoRegisterDescription": "Automatically create new users after logging in",
"LabelBackToUser": "Revenir à lUtilisateur",
"LabelBackupLocation": "Backup Location",
"LabelAutoFetchMetadata": "Recherche automatique de métadonnées",
"LabelAutoFetchMetadataHelp": "Récupère les métadonnées du titre, de lauteur et de la série pour simplifier le téléchargement. Il se peut que des métadonnées supplémentaires doivent être ajoutées après le téléchargement.",
"LabelAutoLaunch": "Lancement automatique",
"LabelAutoLaunchDescription": "Redirection automatique vers le fournisseur d'authentification lors de la navigation vers la page de connexion (chemin de remplacement manuel <code>/login?autoLaunch=0</code>)",
"LabelAutoRegister": "Enregistrement automatique",
"LabelAutoRegisterDescription": "Créer automatiquement de nouveaux utilisateurs après la connexion",
"LabelBackToUser": "Retour à lutilisateur",
"LabelBackupLocation": "Emplacement de la sauvegarde",
"LabelBackupsEnableAutomaticBackups": "Activer les sauvegardes automatiques",
"LabelBackupsEnableAutomaticBackupsHelp": "Sauvegardes Enregistrées dans /metadata/backups",
"LabelBackupsEnableAutomaticBackupsHelp": "Sauvegardes enregistrées dans /metadata/backups",
"LabelBackupsMaxBackupSize": "Taille maximale de la sauvegarde (en Go)",
"LabelBackupsMaxBackupSizeHelp": "Afin de prévenir les mauvaises configuration, la sauvegarde échouera si elle excède la taille limite.",
"LabelBackupsNumberToKeep": "Nombre de sauvegardes à maintenir",
"LabelBackupsNumberToKeepHelp": "Une seule sauvegarde sera effacée à la fois. Si vous avez plus de sauvegardes à effacer, vous devrez le faire manuellement.",
"LabelBackupsNumberToKeep": "Nombre de sauvegardes à conserver",
"LabelBackupsNumberToKeepHelp": "Seule une sauvegarde sera supprimée à la fois. Si vous avez déjà plus de sauvegardes à effacer, vous devez les supprimer manuellement.",
"LabelBitrate": "Bitrate",
"LabelBooks": "Livres",
"LabelButtonText": "Button Text",
"LabelButtonText": "Texte du bouton",
"LabelChangePassword": "Modifier le mot de passe",
"LabelChannels": "Canaux",
"LabelChapters": "Chapitres",
"LabelChaptersFound": "Chapitres trouvés",
"LabelChapterTitle": "Titres du chapitre",
"LabelClickForMoreInfo": "Click for more info",
"LabelChaptersFound": "chapitres trouvés",
"LabelChapterTitle": "Titre du chapitre",
"LabelClickForMoreInfo": "Cliquez ici pour plus dinformations",
"LabelClosePlayer": "Fermer le lecteur",
"LabelCodec": "Codec",
"LabelCollapseSeries": "Réduire les séries",
@@ -235,20 +235,20 @@
"LabelCover": "Couverture",
"LabelCoverImageURL": "URL vers limage de couverture",
"LabelCreatedAt": "Créé le",
"LabelCronExpression": "Expression Cron",
"LabelCurrent": "Courrant",
"LabelCurrently": "En ce moment :",
"LabelCustomCronExpression": "Expression cron personnalisée:",
"LabelDatetime": "Datetime",
"LabelDeleteFromFileSystemCheckbox": "Delete from file system (uncheck to only remove from database)",
"LabelCronExpression": "Expression cron",
"LabelCurrent": "Actuel",
"LabelCurrently": "Actuellement :",
"LabelCustomCronExpression": "Expression cron personnalisée :",
"LabelDatetime": "Date",
"LabelDeleteFromFileSystemCheckbox": "Supprimer du système de fichiers (décocher pour ne supprimer que de la base de données)",
"LabelDescription": "Description",
"LabelDeselectAll": "Tout déselectionner",
"LabelDevice": "Appareil",
"LabelDeviceInfo": "Détail de lappareil",
"LabelDeviceIsAvailableTo": "Device is available to...",
"LabelDeviceIsAvailableTo": "Lappareil est disponible pour…",
"LabelDirectory": "Répertoire",
"LabelDiscFromFilename": "Disque depuis le fichier",
"LabelDiscFromMetadata": "Disque depuis les métadonnées",
"LabelDiscFromFilename": "Depuis le fichier",
"LabelDiscFromMetadata": "Depuis les métadonnées",
"LabelDiscover": "Découvrir",
"LabelDownload": "Téléchargement",
"LabelDownloadNEpisodes": "Télécharger {0} épisode(s)",
@@ -271,17 +271,17 @@
"LabelExample": "Exemple",
"LabelExplicit": "Restriction",
"LabelFeedURL": "URL du flux",
"LabelFetchingMetadata": "Fetching Metadata",
"LabelFetchingMetadata": "Récupération des métadonnées",
"LabelFile": "Fichier",
"LabelFileBirthtime": "Création du fichier",
"LabelFileModified": "Modification du fichier",
"LabelFilename": "Nom de fichier",
"LabelFilterByUser": "Filtrer par lutilisateur",
"LabelFilterByUser": "Filtrer par utilisateur",
"LabelFindEpisodes": "Trouver des épisodes",
"LabelFinished": "Fini(e)",
"LabelFinished": "Terminé le",
"LabelFolder": "Dossier",
"LabelFolders": "Dossiers",
"LabelFontFamily": "Famille de polices",
"LabelFontFamily": "Polices de caractères",
"LabelFontScale": "Taille de la police de caractère",
"LabelFormat": "Format",
"LabelGenre": "Genre",
@@ -289,16 +289,16 @@
"LabelHardDeleteFile": "Suppression du fichier",
"LabelHasEbook": "Dispose dun livre numérique",
"LabelHasSupplementaryEbook": "Dispose dun livre numérique supplémentaire",
"LabelHighestPriority": "Highest priority",
"LabelHighestPriority": "Priorité la plus élevée",
"LabelHost": "Hôte",
"LabelHour": "Heure",
"LabelIcon": "Icone",
"LabelImageURLFromTheWeb": "Image URL from the web",
"LabelIncludeInTracklist": "Inclure dans la liste des pistes",
"LabelIcon": "Icône",
"LabelImageURLFromTheWeb": "URL de limage à partir du web",
"LabelIncludeInTracklist": "Inclure dans la liste de lecture",
"LabelIncomplete": "Incomplet",
"LabelInProgress": "En cours",
"LabelInterval": "Intervalle",
"LabelIntervalCustomDailyWeekly": "Journalier / Hebdomadaire personnalisé",
"LabelIntervalCustomDailyWeekly": "Personnaliser quotidiennement / hebdomadairement",
"LabelIntervalEvery12Hours": "Toutes les 12 heures",
"LabelIntervalEvery15Minutes": "Toutes les 15 minutes",
"LabelIntervalEvery2Hours": "Toutes les 2 heures",
@@ -331,22 +331,22 @@
"LabelLogLevelInfo": "Info",
"LabelLogLevelWarn": "Warn",
"LabelLookForNewEpisodesAfterDate": "Chercher de nouveaux épisode après cette date",
"LabelLowestPriority": "Lowest Priority",
"LabelMatchExistingUsersBy": "Match existing users by",
"LabelMatchExistingUsersByDescription": "Used for connecting existing users. Once connected, users will be matched by a unique id from your SSO provider",
"LabelLowestPriority": "Priorité la plus basse",
"LabelMatchExistingUsersBy": "Faire correspondre les utilisateurs existants par",
"LabelMatchExistingUsersByDescription": "Utilisé pour connecter les utilisateurs existants. Une fois connectés, les utilisateurs seront associés à un identifiant unique provenant de votre fournisseur SSO.",
"LabelMediaPlayer": "Lecteur multimédia",
"LabelMediaType": "Type de média",
"LabelMetadataOrderOfPrecedenceDescription": "Higher priority metadata sources will override lower priority metadata sources",
"LabelMetadataOrderOfPrecedenceDescription": "Les sources de métadonnées ayant une priorité plus élevée auront la priorité sur celles ayant une priorité moins élevée.",
"LabelMetadataProvider": "Fournisseur de métadonnées",
"LabelMetaTag": "Etiquette de métadonnée",
"LabelMetaTags": "Etiquettes de métadonnée",
"LabelMetaTag": "Balise de métadonnée",
"LabelMetaTags": "Balises de métadonnée",
"LabelMinute": "Minute",
"LabelMissing": "Manquant",
"LabelMissingParts": "Parties manquantes",
"LabelMobileRedirectURIs": "Allowed Mobile Redirect URIs",
"LabelMobileRedirectURIsDescription": "This is a whitelist of valid redirect URIs for mobile apps. The default one is <code>audiobookshelf://oauth</code>, which you can remove or supplement with additional URIs for third-party app integration. Using an asterisk (<code>*</code>) as the sole entry permits any URI.",
"LabelMobileRedirectURIs": "URI de redirection mobile autorisés",
"LabelMobileRedirectURIsDescription": "Il s'agit d'une liste blanche dURI de redirection valides pour les applications mobiles. Celui par défaut est <code>audiobookshelf://oauth</code>, que vous pouvez supprimer ou compléter avec des URIs supplémentaires pour l'intégration d'applications tierces. Lutilisation dun astérisque (<code>*</code>) comme seule entrée autorise nimporte quel URI.",
"LabelMore": "Plus",
"LabelMoreInfo": "Plus dinfo",
"LabelMoreInfo": "Plus dinformations",
"LabelName": "Nom",
"LabelNarrator": "Narrateur",
"LabelNarrators": "Narrateurs",
@@ -358,7 +358,7 @@
"LabelNextScheduledRun": "Prochain lancement prévu",
"LabelNoEpisodesSelected": "Aucun épisode sélectionné",
"LabelNotes": "Notes",
"LabelNotFinished": "Non terminé(e)",
"LabelNotFinished": "Non terminé",
"LabelNotificationAppriseURL": "URL(s) dApprise",
"LabelNotificationAvailableVariables": "Variables disponibles",
"LabelNotificationBodyTemplate": "Modèle de Message",
@@ -367,10 +367,10 @@
"LabelNotificationsMaxFailedAttemptsHelp": "La notification est abandonnée une fois ce seuil atteint",
"LabelNotificationsMaxQueueSize": "Nombres de notifications maximum à mettre en attente",
"LabelNotificationsMaxQueueSizeHelp": "La limite de notification est de un évènement par seconde. Les notifications seront ignorées si la file dattente est à son maximum. Cela empêche un flot trop important.",
"LabelNotificationTitleTemplate": "Modèle de Titre",
"LabelNotStarted": "Non Démarré(e)",
"LabelNumberOfBooks": "Nombre de Livres",
"LabelNumberOfEpisodes": "Nombre dEpisodes",
"LabelNotificationTitleTemplate": "Modèle de titre",
"LabelNotStarted": "Pas commencé",
"LabelNumberOfBooks": "Nombre de livres",
"LabelNumberOfEpisodes": "Nombre dépisodes",
"LabelOpenRSSFeed": "Ouvrir le flux RSS",
"LabelOverwrite": "Écraser",
"LabelPassword": "Mot de passe",
@@ -406,12 +406,12 @@
"LabelRegion": "Région",
"LabelReleaseDate": "Date de parution",
"LabelRemoveCover": "Supprimer la couverture",
"LabelRowsPerPage": "Rows per page",
"LabelRowsPerPage": "Lignes par page",
"LabelRSSFeedCustomOwnerEmail": "Courriel du propriétaire personnalisé",
"LabelRSSFeedCustomOwnerName": "Nom propriétaire personnalisé",
"LabelRSSFeedOpen": "Flux RSS ouvert",
"LabelRSSFeedPreventIndexing": "Empêcher lindexation",
"LabelRSSFeedSlug": "Identificateur dadresse du Flux RSS ",
"LabelRSSFeedSlug": "Balise URL du flux RSS",
"LabelRSSFeedURL": "Adresse du flux RSS",
"LabelSearchTerm": "Terme de recherche",
"LabelSearchTitle": "Titre de recherche",
@@ -419,8 +419,8 @@
"LabelSeason": "Saison",
"LabelSelectAllEpisodes": "Sélectionner tous les épisodes",
"LabelSelectEpisodesShowing": "Sélectionner {0} episode(s) en cours",
"LabelSelectUsers": "Select users",
"LabelSendEbookToDevice": "Envoyer le livre numérique à...",
"LabelSelectUsers": "Sélectionner les utilisateurs",
"LabelSendEbookToDevice": "Envoyer le livre numérique à",
"LabelSequence": "Séquence",
"LabelSeries": "Séries",
"LabelSeriesName": "Nom de la série",
@@ -428,18 +428,18 @@
"LabelSetEbookAsPrimary": "Définir comme principale",
"LabelSetEbookAsSupplementary": "Définir comme supplémentaire",
"LabelSettingsAudiobooksOnly": "Livres audios seulement",
"LabelSettingsAudiobooksOnlyHelp": "Lactivation de ce paramètre ignorera les fichiers “ ebook ”, à moins quils ne se trouvent dans un dossier de livres audio, auquel cas ils seront définis comme des livres numériques supplémentaires.",
"LabelSettingsAudiobooksOnlyHelp": "L'activation de ce paramètre ignorera les fichiers de type « livre numériques », sauf s'ils se trouvent dans un dossier spécifique , auquel cas ils seront définis comme des livres numériques supplémentaires.",
"LabelSettingsBookshelfViewHelp": "Interface skeuomorphique avec une étagère en bois",
"LabelSettingsChromecastSupport": "Support du Chromecast",
"LabelSettingsDateFormat": "Format de date",
"LabelSettingsDisableWatcher": "Désactiver la surveillance",
"LabelSettingsDisableWatcherForLibrary": "Désactiver la surveillance des dossiers pour la bibliothèque",
"LabelSettingsDisableWatcherHelp": "Désactive la mise à jour automatique lorsque des modifications de fichiers sont détectées. *Nécessite le redémarrage du serveur",
"LabelSettingsDisableWatcherHelp": "Désactive la mise à jour automatique lorsque des modifications de fichiers sont détectées. * nécessite le redémarrage du serveur",
"LabelSettingsEnableWatcher": "Activer la veille",
"LabelSettingsEnableWatcherForLibrary": "Activer la surveillance des dossiers pour la bibliothèque",
"LabelSettingsEnableWatcherHelp": "Active la mise à jour automatique automatique lorsque des modifications de fichiers sont détectées. *Nécessite le redémarrage du serveur",
"LabelSettingsEnableWatcherHelp": "Active la mise à jour automatique automatique lorsque des modifications de fichiers sont détectées. * nécessite le redémarrage du serveur",
"LabelSettingsExperimentalFeatures": "Fonctionnalités expérimentales",
"LabelSettingsExperimentalFeaturesHelp": "Fonctionnalités en cours de développement sur lesquelles nous attendons votre retour et expérience. Cliquez pour ouvrir la discussion Github.",
"LabelSettingsExperimentalFeaturesHelp": "Fonctionnalités en cours de développement sur lesquelles nous attendons votre retour et expérience. Cliquez pour ouvrir la discussion GitHub.",
"LabelSettingsFindCovers": "Chercher des couvertures de livre",
"LabelSettingsFindCoversHelp": "Si votre livre audio ne possède pas de couverture intégrée ou une image de couverture dans le dossier, lanalyseur tentera de récupérer une couverture.<br>Attention, cela peut augmenter le temps danalyse.",
"LabelSettingsHideSingleBookSeries": "Masquer les séries de livres uniques",
@@ -447,13 +447,13 @@
"LabelSettingsHomePageBookshelfView": "La page daccueil utilise la vue étagère",
"LabelSettingsLibraryBookshelfView": "La bibliothèque utilise la vue étagère",
"LabelSettingsParseSubtitles": "Analyser les sous-titres",
"LabelSettingsParseSubtitlesHelp": "Extrait les sous-titres depuis le dossier du Livre Audio.<br>Les sous-titres doivent être séparés par « - »<br>i.e. « Titre du Livre - Ceci est un sous-titre » aura le sous-titre « Ceci est un sous-titre »",
"LabelSettingsParseSubtitlesHelp": "Extrait les sous-titres depuis le dossier du livre audio.<br>Les sous-titres doivent être séparés par « - »<br>cest-à-dire : « Titre du livre - Ceci est un sous-titre » aura le sous-titre « Ceci est un sous-titre »",
"LabelSettingsPreferMatchedMetadata": "Préférer les métadonnées par correspondance",
"LabelSettingsPreferMatchedMetadataHelp": "Les métadonnées par correspondance écrase les détails de larticle lors dune recherche par correspondance rapide. Par défaut, la recherche par correspondance rapide ne comblera que les éléments manquant.",
"LabelSettingsSkipMatchingBooksWithASIN": "Ignorer la recherche par correspondance sur les livres ayant déjà un ASIN",
"LabelSettingsSkipMatchingBooksWithISBN": "Ignorer la recherche par correspondance sur les livres ayant déjà un ISBN",
"LabelSettingsSortingIgnorePrefixes": "Ignorer les préfixes lors du tri",
"LabelSettingsSortingIgnorePrefixesHelp": "i.e. pour le préfixe « le », le livre avec pour titre « Le Titre du Livre » sera trié en tant que « Titre du Livre, Le »",
"LabelSettingsSortingIgnorePrefixesHelp": "cest-à-dire : pour le préfixe « le », le livre avec pour titre « Le Titre du Livre » sera trié en tant que « Titre du Livre, Le »",
"LabelSettingsSquareBookCovers": "Utiliser des couvertures carrées",
"LabelSettingsSquareBookCoversHelp": "Préférer les couvertures carrées par rapport aux couvertures standards de ratio 1.6:1.",
"LabelSettingsStoreCoversWithItem": "Enregistrer la couverture avec les articles",
@@ -461,30 +461,30 @@
"LabelSettingsStoreMetadataWithItem": "Enregistrer les Métadonnées avec les articles",
"LabelSettingsStoreMetadataWithItemHelp": "Par défaut, les métadonnées sont enregistrées dans /metadata/items",
"LabelSettingsTimeFormat": "Format dheure",
"LabelShowAll": "Afficher Tout",
"LabelShowAll": "Tout afficher",
"LabelSize": "Taille",
"LabelSleepTimer": "Minuterie",
"LabelSlug": "Slug",
"LabelSlug": "Balise",
"LabelStart": "Démarrer",
"LabelStarted": "Démarré",
"LabelStartedAt": "Démarré à",
"LabelStartTime": "Heure de Démarrage",
"LabelStartTime": "Heure de démarrage",
"LabelStatsAudioTracks": "Pistes Audios",
"LabelStatsAuthors": "Auteurs",
"LabelStatsBestDay": "Meilleur Jour",
"LabelStatsDailyAverage": "Moyenne Journalière",
"LabelStatsBestDay": "Meilleur jour",
"LabelStatsDailyAverage": "Moyenne journalière",
"LabelStatsDays": "Jours",
"LabelStatsDaysListened": "Jours découte",
"LabelStatsHours": "Heures",
"LabelStatsInARow": "daffilé(s)",
"LabelStatsInARow": "daffilée(s)",
"LabelStatsItemsFinished": "Articles terminés",
"LabelStatsItemsInLibrary": "Articles dans la Bibliothèque",
"LabelStatsItemsInLibrary": "Articles dans la bibliothèque",
"LabelStatsMinutes": "minutes",
"LabelStatsMinutesListening": "Minutes découte",
"LabelStatsOverallDays": "Jours au total",
"LabelStatsOverallHours": "Heures au total",
"LabelStatsOverallDays": "Nombre total de jours",
"LabelStatsOverallHours": "Nombre total d'heures",
"LabelStatsWeekListening": "Écoute de la semaine",
"LabelSubtitle": "Sous-Titre",
"LabelSubtitle": "Sous-titre",
"LabelSupportedFileTypes": "Types de fichiers supportés",
"LabelTag": "Étiquette",
"LabelTags": "Étiquettes",
@@ -496,23 +496,23 @@
"LabelThemeLight": "Clair",
"LabelTimeBase": "Base de temps",
"LabelTimeListened": "Temps découte",
"LabelTimeListenedToday": "Nombres découtes Aujourdhui",
"LabelTimeListenedToday": "Nombres découtes aujourdhui",
"LabelTimeRemaining": "{0} restantes",
"LabelTimeToShift": "Temps de décalage en secondes",
"LabelTitle": "Titre",
"LabelToolsEmbedMetadata": "Métadonnées Intégrées",
"LabelToolsEmbedMetadata": "Métadonnées intégrées",
"LabelToolsEmbedMetadataDescription": "Intègre les métadonnées au fichier audio avec la couverture et les chapitres.",
"LabelToolsMakeM4b": "Créer un fichier Livre Audio M4B",
"LabelToolsMakeM4bDescription": "Génère un fichier Livre Audio .M4B avec intégration des métadonnées, image de couverture et les chapitres.",
"LabelToolsMakeM4b": "Créer un fichier livre audio M4B",
"LabelToolsMakeM4bDescription": "Générer un fichier de livre audio .M4B avec des métadonnées intégrées, une image de couverture et des chapitres.",
"LabelToolsSplitM4b": "Scinde le fichier M4B en fichiers MP3",
"LabelToolsSplitM4bDescription": "Créer plusieurs fichier MP3 à partir du découpage par chapitre, en incluant les métadonnées, limage de couverture et les chapitres.",
"LabelTotalDuration": "Durée Totale",
"LabelTotalDuration": "Durée totale",
"LabelTotalTimeListened": "Temps découte total",
"LabelTrackFromFilename": "Piste depuis le fichier",
"LabelTrackFromMetadata": "Piste depuis les métadonnées",
"LabelTracks": "Pistes",
"LabelTracksMultiTrack": "Piste multiple",
"LabelTracksNone": "No tracks",
"LabelTracksNone": "Aucune piste",
"LabelTracksSingleTrack": "Piste simple",
"LabelType": "Type",
"LabelUnabridged": "Version intégrale",
@@ -524,9 +524,9 @@
"LabelUpdateDetailsHelp": "Autoriser la mise à jour des détails existants lorsquune correspondance est trouvée",
"LabelUploaderDragAndDrop": "Glisser et déposer des fichiers ou dossiers",
"LabelUploaderDropFiles": "Déposer des fichiers",
"LabelUploaderItemFetchMetadataHelp": "Automatically fetch title, author, and series",
"LabelUploaderItemFetchMetadataHelp": "Récupérer automatiquement le titre, lauteur et la série",
"LabelUseChapterTrack": "Utiliser la piste du chapitre",
"LabelUseFullTrack": "Utiliser la piste Complète",
"LabelUseFullTrack": "Utiliser la piste complète",
"LabelUser": "Utilisateur",
"LabelUsername": "Nom dutilisateur",
"LabelValue": "Valeur",
@@ -541,14 +541,14 @@
"LabelYourPlaylists": "Vos listes de lecture",
"LabelYourProgress": "Votre progression",
"MessageAddToPlayerQueue": "Ajouter en file dattente",
"MessageAppriseDescription": "Nécessite une instance d<a href=\"https://github.com/caronc/apprise-api\" target=\"_blank\">API Apprise</a> pour utiliser cette fonctionnalité ou une api qui prend en charge les mêmes requêtes. <br />lURL de lAPI Apprise doit comprendre le chemin complet pour envoyer la notification. Par exemple, si votre instance écoute sur <code>http://192.168.1.1:8337</code> alors vous devez mettre <code>http://192.168.1.1:8337/notify</code>.",
"MessageAppriseDescription": "Nécessite une instance d<a href=\"https://github.com/caronc/apprise-api\" target=\"_blank\">API Apprise</a> pour utiliser cette fonctionnalité ou une api qui prend en charge les mêmes requêtes.<br>LURL de lAPI Apprise doit comprendre le chemin complet pour envoyer la notification. Par exemple, si votre instance écoute sur <code>http://192.168.1.1:8337</code> alors vous devez mettre <code>http://192.168.1.1:8337/notify</code>.",
"MessageBackupsDescription": "Les sauvegardes incluent les utilisateurs, la progression de lecture par utilisateur, les détails des articles des bibliothèques, les paramètres du serveur et les images sauvegardées. Les sauvegardes nincluent pas les fichiers de votre bibliothèque.",
"MessageBatchQuickMatchDescription": "La recherche par correspondance rapide tentera dajouter les couvertures et les métadonnées manquantes pour les articles sélectionnés. Activer loption suivante pour autoriser la recherche par correspondance à écraser les données existantes.",
"MessageBookshelfNoCollections": "Vous navez pas encore de collections",
"MessageBookshelfNoResultsForFilter": "Aucun résultat pour le filtre « {0}: {1} »",
"MessageBookshelfNoResultsForFilter": "Aucun résultat pour le filtre « {0} : {1} »",
"MessageBookshelfNoRSSFeeds": "Aucun flux RSS nest ouvert",
"MessageBookshelfNoSeries": "Vous navez aucune série",
"MessageChapterEndIsAfter": "Le Chapitre Fin est situé à la fin de votre Livre Audio",
"MessageChapterEndIsAfter": "La fin du chapitre se situe après la fin de votre livre audio.",
"MessageChapterErrorFirstNotZero": "Le premier capitre doit débuter à 0",
"MessageChapterErrorStartGteDuration": "Horodatage invalide car il doit débuter avant la fin du livre",
"MessageChapterErrorStartLtPrev": "Horodatage invalide car il doit débuter au moins après le précédent chapitre",
@@ -558,15 +558,15 @@
"MessageConfirmDeleteBackup": "Êtes-vous sûr de vouloir supprimer la sauvegarde de « {0} » ?",
"MessageConfirmDeleteFile": "Cela supprimera le fichier de votre système de fichiers. Êtes-vous sûr ?",
"MessageConfirmDeleteLibrary": "Êtes-vous sûr de vouloir supprimer définitivement la bibliothèque « {0} » ?",
"MessageConfirmDeleteLibraryItem": "This will delete the library item from the database and your file system. Are you sure?",
"MessageConfirmDeleteLibraryItems": "This will delete {0} library items from the database and your file system. Are you sure?",
"MessageConfirmDeleteLibraryItem": "Cette opération supprimera lélément de la base de données et de votre système de fichiers. Êtes-vous sûr ?",
"MessageConfirmDeleteLibraryItems": "Cette opération supprimera {0} éléments de la base de données et de votre système de fichiers. Êtes-vous sûr ?",
"MessageConfirmDeleteSession": "Êtes-vous sûr de vouloir supprimer cette session ?",
"MessageConfirmForceReScan": "Êtes-vous sûr de vouloir lancer une analyse forcée ?",
"MessageConfirmMarkAllEpisodesFinished": "Êtes-vous sûr de marquer tous les épisodes comme terminés ?",
"MessageConfirmMarkAllEpisodesNotFinished": "Êtes-vous sûr de vouloir marquer tous les épisodes comme non terminés ?",
"MessageConfirmMarkSeriesFinished": "Êtes-vous sûr de vouloir marquer tous les livres de cette série comme terminées ?",
"MessageConfirmMarkSeriesNotFinished": "Êtes-vous sûr de vouloir marquer tous les livres de cette série comme comme non terminés ?",
"MessageConfirmQuickEmbed": "Warning! Quick embed will not backup your audio files. Make sure that you have a backup of your audio files. <br><br>Would you like to continue?",
"MessageConfirmQuickEmbed": "Attention ! Lintégration rapide ne sauvegardera pas vos fichiers audio. Assurez-vous davoir effectuer une sauvegarde de vos fichiers audio.<br><br>Souhaitez-vous continuer ?",
"MessageConfirmRemoveAllChapters": "Êtes-vous sûr de vouloir supprimer tous les chapitres ?",
"MessageConfirmRemoveAuthor": "Are you sure you want to remove author \"{0}\"?",
"MessageConfirmRemoveCollection": "Êtes-vous sûr de vouloir supprimer la collection « {0} » ?",
@@ -581,16 +581,16 @@
"MessageConfirmRenameTag": "Êtes-vous sûr de vouloir renommer létiquette « {0} » en « {1} » pour tous les articles ?",
"MessageConfirmRenameTagMergeNote": "Information: Cette étiquette existe déjà et sera fusionnée.",
"MessageConfirmRenameTagWarning": "Attention ! Une étiquette similaire avec une casse différente existe déjà « {0} ».",
"MessageConfirmReScanLibraryItems": "Are you sure you want to re-scan {0} items?",
"MessageConfirmReScanLibraryItems": "Êtes-vous sûr de vouloir re-analyser {0} éléments ?",
"MessageConfirmSendEbookToDevice": "Êtes-vous sûr de vouloir envoyer le livre numérique {0} « {1} » à lappareil « {2} »?",
"MessageDownloadingEpisode": "Téléchargement de lépisode",
"MessageDragFilesIntoTrackOrder": "Faire glisser les fichiers dans lordre correct",
"MessageDragFilesIntoTrackOrder": "Faites glisser les fichiers dans lordre correct des pistes",
"MessageEmbedFinished": "Intégration terminée !",
"MessageEpisodesQueuedForDownload": "{0} épisode(s) mis en file pour téléchargement",
"MessageFeedURLWillBe": "lURL du flux sera {0}",
"MessageFeedURLWillBe": "LURL du flux sera {0}",
"MessageFetching": "Récupération…",
"MessageForceReScanDescription": "Analysera tous les fichiers de nouveau. Les étiquettes ID3 des fichiers audios, fichiers OPF, et les fichiers textes seront analysés comme sils étaient nouveaux.",
"MessageImportantNotice": "Information Importante !",
"MessageForceReScanDescription": "analysera de nouveau tous les fichiers. Les étiquettes ID3 des fichiers audio, les fichiers OPF et les fichiers texte seront analysés comme sils étaient nouveaux.",
"MessageImportantNotice": "Information importante !",
"MessageInsertChapterBelow": "Insérer le chapitre ci-dessous",
"MessageItemsSelected": "{0} articles sélectionnés",
"MessageItemsUpdated": "{0} articles mis à jour",
@@ -646,13 +646,13 @@
"MessageRemoveChapter": "Supprimer le chapitre",
"MessageRemoveEpisodes": "Suppression de {0} épisode(s)",
"MessageRemoveFromPlayerQueue": "Supprimer de la liste découte",
"MessageRemoveUserWarning": "Êtes-vous certain de vouloir supprimer définitivement lutilisateur « {0} » ?",
"MessageRemoveUserWarning": "Êtes-vous sûr de vouloir supprimer définitivement lutilisateur « {0} » ?",
"MessageReportBugsAndContribute": "Remonter des anomalies, demander des fonctionnalités et contribuer sur",
"MessageResetChaptersConfirm": "Êtes-vous certain de vouloir réinitialiser les chapitres et annuler les changements effectués ?",
"MessageRestoreBackupConfirm": "Êtes-vous certain de vouloir restaurer la sauvegarde créée le",
"MessageRestoreBackupWarning": "Restaurer la sauvegarde écrasera la base de donnée située dans le dossier /config ainsi que les images sur /metadata/items et /metadata/authors.<br /><br />Les sauvegardes ne touchent pas aux fichiers de la bibliothèque. Si vous avez activé le paramètre pour sauvegarder les métadonnées et les images de couverture dans le même dossier que les fichiers, ceux-ci ne ni sauvegardés, ni écrasés lors de la restauration.<br /><br />Tous les clients utilisant votre serveur seront automatiquement mis à jour.",
"MessageResetChaptersConfirm": "Êtes-vous sûr de vouloir réinitialiser les chapitres et annuler les changements effectués ?",
"MessageRestoreBackupConfirm": "Êtes-vous sûr de vouloir restaurer la sauvegarde créée le",
"MessageRestoreBackupWarning": "Restaurer la sauvegarde écrasera la base de donnée située dans le dossier /config ainsi que les images sur /metadata/items et /metadata/authors.<br><br>Les sauvegardes ne touchent pas aux fichiers de la bibliothèque. Si vous avez activé le paramètre pour sauvegarder les métadonnées et les images de couverture dans le même dossier que les fichiers, ceux-ci ne ni sauvegardés, ni écrasés lors de la restauration.<br><br>Tous les clients utilisant votre serveur seront automatiquement mis à jour.",
"MessageSearchResultsFor": "Résultats de recherche pour",
"MessageSelected": "{0} selected",
"MessageSelected": "{0} sélectionnés",
"MessageServerCouldNotBeReached": "Serveur inaccessible",
"MessageSetChaptersFromTracksDescription": "Positionne un chapitre par fichier audio, avec le titre du fichier comme titre de chapitre",
"MessageStartPlaybackAtTime": "Démarrer la lecture pour « {0} » à {1} ?",
@@ -663,12 +663,11 @@
"MessageValidCronExpression": "Expression cron valide",
"MessageWatcherIsDisabledGlobally": "La surveillance est désactivée par un paramètre global du serveur",
"MessageXLibraryIsEmpty": "La bibliothèque {0} est vide !",
"MessageYourAudiobookDurationIsLonger": "La durée de votre Livre Audio est plus longue que la durée trouvée",
"MessageYourAudiobookDurationIsShorter": "La durée de votre Livre Audio est plus courte que la durée trouvée",
"MessageYourAudiobookDurationIsLonger": "La durée de votre livre audio est plus longue que la durée trouvée",
"MessageYourAudiobookDurationIsShorter": "La durée de votre livre audio est plus courte que la durée trouvée",
"NoteChangeRootPassword": "seul lutilisateur « root » peut utiliser un mot de passe vide",
"NoteChapterEditorTimes": "Information : lhorodatage du premier chapitre doit être à 0:00 et celui du dernier chapitre ne peut se situer au-delà de la durée du Livre Audio.",
"NoteChapterEditorTimes": "Information : lhorodatage du premier chapitre doit être à 0:00 et celui du dernier chapitre ne peut se situer au-delà de la durée du livre audio.",
"NoteFolderPicker": "Information : Les dossiers déjà surveillés ne sont pas affichés",
"NoteFolderPickerDebian": "Information : La sélection de dossier sur une installation debian nest pas finalisée. Merci de renseigner le chemin complet vers votre bibliothèque manuellement.",
"NoteRSSFeedPodcastAppsHttps": "Attention : la majorité des application de podcast nécessite une adresse de flux en HTTPS.",
"NoteRSSFeedPodcastAppsPubDate": "Attention : un ou plusieurs de vos épisodes ne possèdent pas de date de publication. Certaines applications de podcast le requièrent.",
"NoteUploaderFoldersWithMediaFiles": "Les dossiers contenant des fichiers multimédias seront traités comme des éléments distincts de la bibliothèque.",
@@ -677,8 +676,8 @@
"PlaceholderNewCollection": "Nom de la nouvelle collection",
"PlaceholderNewFolderPath": "Nouveau chemin de dossier",
"PlaceholderNewPlaylist": "Nouveau nom de liste de lecture",
"PlaceholderSearch": "Recherche...",
"PlaceholderSearchEpisode": "Recherche dépisode...",
"PlaceholderSearch": "Recherche",
"PlaceholderSearchEpisode": "Recherche dépisode",
"ToastAccountUpdateFailed": "Échec de la mise à jour du compte",
"ToastAccountUpdateSuccess": "Compte mis à jour",
"ToastAuthorImageRemoveFailed": "Échec de la suppression de limage",

View File

@@ -668,7 +668,6 @@
"NoteChangeRootPassword": "Root user is the only user that can have an empty password",
"NoteChapterEditorTimes": "Note: First chapter start time must remain at 0:00 and the last chapter start time cannot exceed this audiobooks duration.",
"NoteFolderPicker": "Note: folders already mapped will not be shown",
"NoteFolderPickerDebian": "Note: Folder picker for the debian install is not fully implemented. You should enter the path to your library directly.",
"NoteRSSFeedPodcastAppsHttps": "Warning: Most podcast apps will require the RSS feed URL is using HTTPS",
"NoteRSSFeedPodcastAppsPubDate": "Warning: 1 or more of your episodes do not have a Pub Date. Some podcast apps require this.",
"NoteUploaderFoldersWithMediaFiles": "Folders with media files will be handled as separate library items.",

View File

@@ -668,7 +668,6 @@
"NoteChangeRootPassword": "रूट user is the only user that can have an empty password",
"NoteChapterEditorTimes": "Note: First chapter start time must remain at 0:00 and the last chapter start time cannot exceed this audiobooks duration.",
"NoteFolderPicker": "Note: folders already mapped will not be shown",
"NoteFolderPickerDebian": "Note: Folder picker for the debian install is not fully implemented. You should enter the path to your library directly.",
"NoteRSSFeedPodcastAppsHttps": "Warning: Most podcast apps will require the RSS feed URL is using HTTPS",
"NoteRSSFeedPodcastAppsPubDate": "Warning: 1 or more of your episodes do not have a Pub Date. Some podcast apps require this.",
"NoteUploaderFoldersWithMediaFiles": "Folders with media files will be handled as separate library items.",

View File

@@ -668,7 +668,6 @@
"NoteChangeRootPassword": "Root korisnik je jedini korisnik koji može imati praznu lozinku",
"NoteChapterEditorTimes": "Bilješka: Prvo početno vrijeme poglavlja mora ostati na 0:00 i posljednje vrijeme poglavlja ne smije preći vrijeme trajanja ove audio knjige.",
"NoteFolderPicker": "Bilješka: več mapirani folderi neće biti prikazani",
"NoteFolderPickerDebian": "Bilješka: Folder picker za debian instalaciju nije potpuno implementiran. Trebate unjeti direktnu putanju do biblioteke.",
"NoteRSSFeedPodcastAppsHttps": "Upozorenje: Večina podcasta će trebati RSS feed URL koji koristi HTTPS",
"NoteRSSFeedPodcastAppsPubDate": "Upozorenje: 1 ili više vaših epizoda nemaju datum objavljivanja. Neke podcast aplikacije zahtjevaju to.",
"NoteUploaderFoldersWithMediaFiles": "Folderi sa media datotekama će biti tretirane kao odvojene stavke u biblioteki.",

View File

@@ -668,7 +668,6 @@
"NoteChangeRootPassword": "L'utente root è l'unico utente che può avere una password vuota",
"NoteChapterEditorTimes": "Nota: l'ora di inizio del primo capitolo deve rimanere alle 0:00 e l'ora di inizio dell'ultimo capitolo non può superare la durata di questo audiolibro.",
"NoteFolderPicker": "Nota: le cartelle già mappate non verranno visualizzate",
"NoteFolderPickerDebian": "Nota: il selettore di cartelle per l'installazione di Debian non è completamente implementato. Dovresti inserire direttamente il percorso della tua libreria.",
"NoteRSSFeedPodcastAppsHttps": "Avviso: la maggior parte delle app di podcast richiede che l'URL del feed RSS utilizzi HTTPS",
"NoteRSSFeedPodcastAppsPubDate": "Avviso: 1 o più delle tue puntate non hanno una data di pubblicazione. Alcune app di podcast lo richiedono.",
"NoteUploaderFoldersWithMediaFiles": "Le cartelle con file multimediali verranno gestite come elementi della libreria separati.",

View File

@@ -668,7 +668,6 @@
"NoteChangeRootPassword": "Tik root vartotojas gali turėti tuščią slaptažodį",
"NoteChapterEditorTimes": "Pastaba: Pirmasis skyriaus pradžios laikas turi likti 0:00, o paskutinio skyriaus pradžios laikas negali viršyti šios garso knygos trukmės.",
"NoteFolderPicker": "Pastaba: jau susieti aplankai nebus rodomi",
"NoteFolderPickerDebian": "Pastaba: Aplanko pasirinkimo įrankis „Debian“ sistemoje nėra visiškai įgyvendintas. Turėtumėte tiesiogiai įvesti kelią į savo biblioteką.",
"NoteRSSFeedPodcastAppsHttps": "Įspėjimas: Dauguma tinklalaidžių programų reikalauja, kad RSS kanalo URL būtų naudojamas su HTTPS",
"NoteRSSFeedPodcastAppsPubDate": "Įspėjimas: Vienas ar daugiau jūsų epizodų neturi publikavimo datos. Kai kurios tinklalaidžių programos to reikalauja.",
"NoteUploaderFoldersWithMediaFiles": "Aplankai su medijos failais bus tvarkomi kaip atskiri bibliotekos elementai.",

View File

@@ -668,7 +668,6 @@
"NoteChangeRootPassword": "Root-gebruiker is de enige gebruiker die een leeg wachtwoord kan hebben",
"NoteChapterEditorTimes": "Opmerking: Starttijd van het eerste hoofdstuk moet op 0:00 blijven en de starttijd van het laatste hoofdstuk mag niet de duur van het audioboek overschrijden.",
"NoteFolderPicker": "Opmerking: Reeds gemapte mappen worden niet getoond",
"NoteFolderPickerDebian": "Opmerking: Mappenkiezer voor de debian installatie is niet volledig geimplementeerd. Je moet het pad naar je map zelf invoeren.",
"NoteRSSFeedPodcastAppsHttps": "Waarschuwing: De meeste podcast-apps zullen eisen dat de RSS-feed URL HTTPS gebruikt",
"NoteRSSFeedPodcastAppsPubDate": "Waarschuwing: 1 of meer van je afleveringen hebben geen Pub Datum. Sommige podcast-apps vereisen dit.",
"NoteUploaderFoldersWithMediaFiles": "Mappen met mediabestanden zullen worden behandeld als aparte bibliotheekonderdelen.",

View File

@@ -668,7 +668,6 @@
"NoteChangeRootPassword": "Root-bruker er eneste bruker som kan ha tumt passord",
"NoteChapterEditorTimes": "Notis: Første kapittel start tid må være 0:00 og siste kapittel start tid kan ikke overskride denne lydbokens lengde.",
"NoteFolderPicker": "Notis: allerede funnet mapper vil ikke bli vist",
"NoteFolderPickerDebian": "Notis: Mappevelger for debian er ikke fullstendig implementert. Du burde skrive inn stien til biblioteket direkte.",
"NoteRSSFeedPodcastAppsHttps": "Advarsel! De fleste podcast applikasjoner trenger RSS feed URL som bruker HTTPS",
"NoteRSSFeedPodcastAppsPubDate": "Advarsel! 1 eller flere av episodene har ikke publikasjonsdato. Noen podcast applikasjoner trenger dette.",
"NoteUploaderFoldersWithMediaFiles": "Mapper med mediefiler vil bli behandlet som separate bibliotekgjenstander.",

View File

@@ -668,7 +668,6 @@
"NoteChangeRootPassword": "Tylko użytkownik root, może posiadać puste hasło",
"NoteChapterEditorTimes": "Uwaga: Czas rozpoczęcia pierwszego rozdziału musi pozostać na poziomie 0:00, a czas rozpoczęcia ostatniego rozdziału nie może przekroczyć czasu trwania audiobooka.",
"NoteFolderPicker": "Uwaga: dotychczas zmapowane foldery nie zostaną wyświetlone",
"NoteFolderPickerDebian": "Uwaga: Wybór folderu w instalcji opartej o system debian nie jest w pełni zaimplementowany. Powinieneś wprowadzić ścieżkę do swojej biblioteki bezpośrednio.",
"NoteRSSFeedPodcastAppsHttps": "Ostrzeżenie: Większość aplikacji do obsługi podcastów wymaga, aby adres URL kanału RSS korzystał z protokołu HTTPS.",
"NoteRSSFeedPodcastAppsPubDate": "Ostrzeżenie: 1 lub więcej odcinków nie ma daty publikacji. Niektóre aplikacje do słuchania podcastów tego wymagają.",
"NoteUploaderFoldersWithMediaFiles": "Foldery z plikami multimedialnymi będą traktowane jako osobne elementy w bibliotece.",

View File

@@ -668,7 +668,6 @@
"NoteChangeRootPassword": "Пользователь root — единственный пользователь, который может иметь пустой пароль",
"NoteChapterEditorTimes": "Примечание: Время начала первой главы должно оставаться в 0:00, а время начала последней главы не может превышать продолжительность этой аудиокниги.",
"NoteFolderPicker": "Примечание: папки, уже сопоставленные, не будут отображаться",
"NoteFolderPickerDebian": "Примечание: Выбор папок debian не реализован полностью. Необходимо ввести путь к библиотеке напрямую.",
"NoteRSSFeedPodcastAppsHttps": "Предупреждение: Большинству приложений подкастов потребуется, чтобы URL-адрес RSS-канала использовал HTTPS",
"NoteRSSFeedPodcastAppsPubDate": "Предупреждение: 1 или более эпизодов не имеют даты публикации. Некоторые приложения для подкастов требуют этого.",
"NoteUploaderFoldersWithMediaFiles": "Папки с медиафайлами будут обрабатываться как отдельные элементы библиотеки.",

View File

@@ -668,7 +668,6 @@
"NoteChangeRootPassword": "Rotanvändaren är den enda användaren som kan ha ett tomt lösenord",
"NoteChapterEditorTimes": "Obs: Starttiden för första kapitlet måste förbli 0:00 och starttiden för det sista kapitlet får inte överstiga ljudbokens varaktighet.",
"NoteFolderPicker": "Obs: Mappar som redan är kartlagda kommer inte att visas",
"NoteFolderPickerDebian": "Obs: Mappväljaren för Debian-installationen är inte fullständigt implementerad. Du bör ange sökvägen till ditt bibliotek direkt.",
"NoteRSSFeedPodcastAppsHttps": "Varning: De flesta podcastappar kräver att RSS-flödets URL används med HTTPS",
"NoteRSSFeedPodcastAppsPubDate": "Varning: 1 eller flera av dina avsnitt har inte ett publiceringsdatum. Vissa podcastappar kräver detta.",
"NoteUploaderFoldersWithMediaFiles": "Mappar med mediefiler hanteras som separata biblioteksobjekt.",

View File

@@ -668,7 +668,6 @@
"NoteChangeRootPassword": "Root 是唯一可以拥有空密码的用户",
"NoteChapterEditorTimes": "注意: 第一章开始时间必须保持在 0:00, 最后一章开始时间不能超过有声读物持续时间.",
"NoteFolderPicker": "注意: 将不显示已映射的文件夹",
"NoteFolderPickerDebian": "注意: debian 安装的文件夹选择器尚未完全实现. 您应该直接输入媒体库的路径.",
"NoteRSSFeedPodcastAppsHttps": "警告: 大多数播客应用程序都需要 RSS 源 URL 使用 HTTPS",
"NoteRSSFeedPodcastAppsPubDate": "警告: 您的一集或多集没有发布日期. 一些播客应用程序要求这样做.",
"NoteUploaderFoldersWithMediaFiles": "包含媒体文件的文件夹将作为单独的媒体库项目处理.",

package-lock.json generated
View File

@@ -1,12 +1,12 @@
{
"name": "audiobookshelf",
"version": "2.7.0",
"version": "2.7.2",
"lockfileVersion": 2,
"requires": true,
"packages": {
"": {
"name": "audiobookshelf",
"version": "2.7.0",
"version": "2.7.2",
"license": "GPL-3.0",
"dependencies": {
"axios": "^0.27.2",

View File

@@ -1,6 +1,6 @@
{
"name": "audiobookshelf",
"version": "2.7.0",
"version": "2.7.2",
"buildNumber": 1,
"description": "Self-hosted audiobook and podcast server",
"main": "index.js",

View File

@@ -39,13 +39,15 @@ Audiobookshelf is a self-hosted audiobook and podcast server.
Is there a feature you are looking for? [Suggest it](https://github.com/advplyr/audiobookshelf/issues/new/choose)
Join us on [Discord](https://discord.gg/pJsjuNCKRq) or [Matrix](https://matrix.to/#/#audiobookshelf:matrix.org)
Join us on [Discord](https://discord.gg/HQgCbd6E75) or [Matrix](https://matrix.to/#/#audiobookshelf:matrix.org)
### Android App (beta)
Try it out on the [Google Play Store](https://play.google.com/store/apps/details?id=com.audiobookshelf.app)
### iOS App (beta)
Available using Test Flight: https://testflight.apple.com/join/wiic7QIW - [Join the discussion](https://github.com/advplyr/audiobookshelf-app/discussions/60)
**Beta is currently full. Apple has a hard limit of 10k beta testers. Updates will be posted in Discord/Matrix.**
Using Test Flight: https://testflight.apple.com/join/wiic7QIW ***(beta is full)***
### Build your own tools & clients
Check out the [API documentation](https://api.audiobookshelf.org/)

View File

@@ -177,11 +177,11 @@ class Database {
if (process.env.QUERY_LOGGING === "log") {
// Setting QUERY_LOGGING=log will log all Sequelize queries before they run
Logger.info(`[Database] Query logging enabled`)
logging = (query) => Logger.dev(`Running the following query:\n ${query}`)
logging = (query) => Logger.debug(`Running the following query:\n ${query}`)
} else if (process.env.QUERY_LOGGING === "benchmark") {
// Setting QUERY_LOGGING=benchmark will log all Sequelize queries and their execution times, after they run
Logger.info(`[Database] Query benchmarking enabled"`)
logging = (query, time) => Logger.dev(`Ran the following query in ${time}ms:\n ${query}`)
logging = (query, time) => Logger.debug(`Ran the following query in ${time}ms:\n ${query}`)
benchmark = true
}
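
For context, a minimal self-contained sketch of how a QUERY_LOGGING-style switch maps onto Sequelize's logging/benchmark options, as in the hunk above. The dialect, storage path, and variable names below are illustrative assumptions, not the project's actual Database.js.

const { Sequelize } = require('sequelize')

let logging = false
let benchmark = false
if (process.env.QUERY_LOGGING === 'log') {
  // Log every query before it runs
  logging = (query) => console.debug(`Running the following query:\n ${query}`)
} else if (process.env.QUERY_LOGGING === 'benchmark') {
  // With benchmark: true, Sequelize passes the elapsed time (ms) as the second argument
  logging = (query, time) => console.debug(`Ran the following query in ${time}ms:\n ${query}`)
  benchmark = true
}

// Placeholder connection for illustration only
const sequelize = new Sequelize({ dialect: 'sqlite', storage: './example.sqlite', logging, benchmark })
sequelize.authenticate().then(() => console.log('Query logging mode:', process.env.QUERY_LOGGING || 'off'))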

View File

@@ -5,7 +5,6 @@ class Logger {
constructor() {
this.isDev = process.env.NODE_ENV !== 'production'
this.logLevel = !this.isDev ? LogLevel.INFO : LogLevel.TRACE
this.hideDevLogs = process.env.HIDE_DEV_LOGS === undefined ? !this.isDev : process.env.HIDE_DEV_LOGS === '1'
this.socketListeners = []
this.logManager = null
@@ -88,15 +87,6 @@ class Logger {
this.debug(`Set Log Level to ${this.levelString}`)
}
/**
* Only to console and only for development
* @param {...any} args
*/
dev(...args) {
if (this.hideDevLogs) return
console.log(`[${this.timestamp}] DEV:`, ...args)
}
trace(...args) {
if (this.logLevel > LogLevel.TRACE) return
console.trace(`[${this.timestamp}] TRACE:`, ...args)

View File

@@ -33,6 +33,7 @@ const AudioMetadataMangaer = require('./managers/AudioMetadataManager')
const RssFeedManager = require('./managers/RssFeedManager')
const CronManager = require('./managers/CronManager')
const ApiCacheManager = require('./managers/ApiCacheManager')
const BinaryManager = require('./managers/BinaryManager')
const LibraryScanner = require('./scanner/LibraryScanner')
//Import the main Passport and Express-Session library
@@ -74,6 +75,7 @@ class Server {
this.rssFeedManager = new RssFeedManager()
this.cronManager = new CronManager(this.podcastManager)
this.apiCacheManager = new ApiCacheManager()
this.binaryManager = new BinaryManager()
// Routers
this.apiRouter = new ApiRouter(this)
@@ -120,6 +122,11 @@ class Server {
await this.cronManager.init(libraries)
this.apiCacheManager.init()
// Download ffmpeg & ffprobe if not found (Currently only in use for Windows installs)
if (global.isWin || Logger.isDev) {
await this.binaryManager.init()
}
if (Database.serverSettings.scannerDisableWatcher) {
Logger.info(`[Server] Watcher is disabled`)
this.watcher.disabled = true
@@ -136,15 +143,16 @@ class Server {
/**
* @temporary
* This is necessary for the ebook API endpoint in the mobile apps
* This is necessary for the ebook & cover API endpoint in the mobile apps
* The mobile app ereader is using fetch api in Capacitor that is currently difficult to switch to native requests
* so we have to allow cors for specific origins to the /api/items/:id/ebook endpoint
* The cover image is fetched with XMLHttpRequest in the mobile apps to load into a canvas and extract colors
* @see https://ionicframework.com/docs/troubleshooting/cors
*
* Running in development allows cors to allow testing the mobile apps in the browser
*/
app.use((req, res, next) => {
if (Logger.isDev || req.path.match(/\/api\/items\/([a-z0-9-]{36})\/ebook(\/[0-9]+)?/)) {
if (Logger.isDev || req.path.match(/\/api\/items\/([a-z0-9-]{36})\/(ebook|cover)(\/[0-9]+)?/)) {
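// Illustrative note (not part of the upstream change): the updated pattern matches
// /api/items/<36-char-id>/ebook, /api/items/<36-char-id>/ebook/2 and
// /api/items/<36-char-id>/cover, so mobile-app cover requests now receive the same
// CORS headers as ebook requests.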
const allowedOrigins = ['capacitor://localhost', 'http://localhost']
if (Logger.isDev || allowedOrigins.some(o => o === req.get('origin'))) {
res.header('Access-Control-Allow-Origin', req.get('origin'))
@@ -276,6 +284,19 @@ class Server {
})
app.get('/healthcheck', (req, res) => res.sendStatus(200))
let sigintAlreadyReceived = false
process.on('SIGINT', async () => {
if (!sigintAlreadyReceived) {
sigintAlreadyReceived = true
Logger.info('SIGINT (Ctrl+C) received. Shutting down...')
await this.stop()
Logger.info('Server stopped. Exiting.')
} else {
Logger.info('SIGINT (Ctrl+C) received again. Exiting immediately.')
}
process.exit(0)
})
this.server.listen(this.Port, this.Host, () => {
if (this.Host) Logger.info(`Listening on http://${this.Host}:${this.Port}`)
else Logger.info(`Listening on port :${this.Port}`)
@@ -382,12 +403,17 @@ class Server {
res.sendStatus(200)
}
/**
* Gracefully stop server
* Stops watcher and socket server
*/
async stop() {
Logger.info('=== Stopping Server ===')
await this.watcher.close()
Logger.info('Watcher Closed')
return new Promise((resolve) => {
this.server.close((err) => {
SocketAuthority.close((err) => {
if (err) {
Logger.error('Failed to close server', err)
} else {

View File

@@ -73,6 +73,20 @@ class SocketAuthority {
}
}
/**
* Closes the Socket.IO server and disconnect all clients
*
* @param {Function} callback
*/
close(callback) {
Logger.info('[SocketAuthority] Shutting down')
// This will close all open socket connections, and also close the underlying http server
if (this.io)
this.io.close(callback)
else
callback()
}
initialize(Server) {
this.Server = Server
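
Taken together with the Server.js hunks above, the shutdown path is: SIGINT → Server.stop() → close the watcher, then SocketAuthority.close(), which disconnects all clients and closes the underlying HTTP server. A minimal stand-alone sketch of that pattern (port and variable names are illustrative, not the project's):

const http = require('http')
const { Server: SocketIOServer } = require('socket.io')

const httpServer = http.createServer((req, res) => res.end('ok'))
const io = new SocketIOServer(httpServer)
httpServer.listen(3333)

let sigintAlreadyReceived = false
process.on('SIGINT', () => {
  if (sigintAlreadyReceived) {
    console.log('SIGINT received again. Exiting immediately.')
    process.exit(0)
  }
  sigintAlreadyReceived = true
  console.log('SIGINT received. Shutting down...')
  // io.close() disconnects all clients and also closes the underlying http server
  io.close((err) => {
    if (err) console.error('Failed to close server', err)
    process.exit(0)
  })
})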

View File

@@ -1,31 +1,69 @@
const Path = require('path')
const Logger = require('../Logger')
const Database = require('../Database')
const fs = require('../libs/fsExtra')
const { toNumber } = require('../utils/index')
const fileUtils = require('../utils/fileUtils')
class FileSystemController {
constructor() { }
/**
*
* @param {import('express').Request} req
* @param {import('express').Response} res
*/
async getPaths(req, res) {
if (!req.user.isAdminOrUp) {
Logger.error(`[FileSystemController] Non-admin user attempting to get filesystem paths`, req.user)
return res.sendStatus(403)
}
const excludedDirs = ['node_modules', 'client', 'server', '.git', 'static', 'build', 'dist', 'metadata', 'config', 'sys', 'proc'].map(dirname => {
return Path.sep + dirname
})
const relpath = req.query.path
const level = toNumber(req.query.level, 0)
// Do not include existing mapped library paths in response
const libraryFoldersPaths = await Database.libraryFolderModel.getAllLibraryFolderPaths()
libraryFoldersPaths.forEach((path) => {
let dir = path || ''
if (dir.includes(global.appRoot)) dir = dir.replace(global.appRoot, '')
excludedDirs.push(dir)
// Validate path. Must be absolute
if (relpath && (!Path.isAbsolute(relpath) || !await fs.pathExists(relpath))) {
Logger.error(`[FileSystemController] Invalid path in query string "${relpath}"`)
return res.status(400).send('Invalid "path" query string')
}
Logger.debug(`[FileSystemController] Getting file paths at ${relpath || 'root'} (${level})`)
let directories = []
// Windows returns drives first
if (global.isWin) {
if (relpath) {
directories = await fileUtils.getDirectoriesInPath(relpath, level)
} else {
const drives = await fileUtils.getWindowsDrives().catch((error) => {
Logger.error(`[FileSystemController] Failed to get windows drives`, error)
return []
})
if (drives.length) {
directories = drives.map(d => {
return {
path: d,
dirname: d,
level: 0
}
})
}
}
} else {
directories = await fileUtils.getDirectoriesInPath(relpath || '/', level)
}
// Exclude some dirs from this project to be cleaner in Docker
const excludedDirs = ['node_modules', 'client', 'server', '.git', 'static', 'build', 'dist', 'metadata', 'config', 'sys', 'proc', '.devcontainer', '.nyc_output', '.github', '.vscode'].map(dirname => {
return fileUtils.filePathToPOSIX(Path.join(global.appRoot, dirname))
})
directories = directories.filter(dir => {
return !excludedDirs.includes(dir.path)
})
res.json({
directories: await this.getDirectories(global.appRoot, '/', excludedDirs)
posix: !global.isWin,
directories
})
}
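
A hedged usage sketch for the endpoint served by getPaths above. The route path, port, and Bearer-token header are assumptions for illustration (they are not shown in this diff); the response shape matches the res.json() call above.

const axios = require('axios')

async function browse(relpath, level = 0) {
  // Assumed admin endpoint backed by FileSystemController.getPaths
  const res = await axios.get('http://localhost:3333/api/filesystem', {
    params: { path: relpath, level },
    headers: { Authorization: `Bearer ${process.env.ABS_TOKEN}` }
  })
  // Example response: { posix: true, directories: [{ path: '/audiobooks', dirname: 'audiobooks', level: 0 }, ...] }
  return res.data.directories
}

browse('/').then((dirs) => dirs.forEach((d) => console.log(' '.repeat(d.level * 2) + d.path)))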

View File

@@ -0,0 +1,315 @@
const os = require('os')
const path = require('path')
const axios = require('axios')
const fse = require('../fsExtra')
const async = require('../async')
const StreamZip = require('../nodeStreamZip')
const { finished } = require('stream/promises')
var API_URL = 'https://ffbinaries.com/api/v1'
var RUNTIME_CACHE = {}
var errorMsgs = {
connectionIssues: 'Couldn\'t connect to ffbinaries.com API. Check your Internet connection.',
parsingVersionData: 'Couldn\'t parse retrieved version data.',
parsingVersionList: 'Couldn\'t parse the list of available versions.',
notFound: 'Requested data not found.',
incorrectVersionParam: '"version" parameter must be a string.'
}
function ensureDirSync(dir) {
try {
fse.accessSync(dir)
} catch (e) {
fse.mkdirSync(dir)
}
}
/**
* Resolves the platform key based on input string
*/
function resolvePlatform(input) {
var rtn = null
switch (input) {
case 'mac':
case 'osx':
case 'mac-64':
case 'osx-64':
rtn = 'osx-64'
break
case 'linux':
case 'linux-32':
rtn = 'linux-32'
break
case 'linux-64':
rtn = 'linux-64'
break
case 'linux-arm':
case 'linux-armel':
rtn = 'linux-armel'
break
case 'linux-armhf':
rtn = 'linux-armhf'
break
case 'win':
case 'win-32':
case 'windows':
case 'windows-32':
rtn = 'windows-32'
break
case 'win-64':
case 'windows-64':
rtn = 'windows-64'
break
default:
rtn = null
}
return rtn
}
/**
* Detects the platform of the machine the script is executed on.
* Object can be provided to detect platform from info derived elsewhere.
*
* @param {object} osinfo Contains "type" and "arch" properties
*/
function detectPlatform(osinfo) {
var inputIsValid = typeof osinfo === 'object' && typeof osinfo.type === 'string' && typeof osinfo.arch === 'string'
var type = (inputIsValid ? osinfo.type : os.type()).toLowerCase()
var arch = (inputIsValid ? osinfo.arch : os.arch()).toLowerCase()
if (type === 'darwin') {
return 'osx-64'
}
if (type === 'windows_nt') {
return arch === 'x64' ? 'windows-64' : 'windows-32'
}
if (type === 'linux') {
if (arch === 'arm' || arch === 'arm64') {
return 'linux-armel'
}
return arch === 'x64' ? 'linux-64' : 'linux-32'
}
return null
}
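// Illustrative example (not part of the upstream file): on 64-bit Linux,
// os.type() === 'Linux' and os.arch() === 'x64', so detectPlatform() returns 'linux-64';
// detectPlatform({ type: 'Windows_NT', arch: 'x64' }) returns 'windows-64'.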
/**
* Gets the binary filename (appends exe in Windows)
*
* @param {string} component "ffmpeg", "ffplay", "ffprobe" or "ffserver"
* @param {platform} platform "ffmpeg", "ffplay", "ffprobe" or "ffserver"
*/
function getBinaryFilename(component, platform) {
var platformCode = resolvePlatform(platform)
if (platformCode === 'windows-32' || platformCode === 'windows-64') {
return component + '.exe'
}
return component
}
function listPlatforms() {
return ['osx-64', 'linux-32', 'linux-64', 'linux-armel', 'linux-armhf', 'windows-32', 'windows-64']
}
/**
*
* @returns {Promise<string[]>} array of version strings
*/
function listVersions() {
if (RUNTIME_CACHE.versionsAll) {
return RUNTIME_CACHE.versionsAll
}
return axios.get(API_URL).then((res) => {
if (!res.data?.versions || !Object.keys(res.data.versions)?.length) {
throw new Error(errorMsgs.parsingVersionList)
}
const versionKeys = Object.keys(res.data.versions)
RUNTIME_CACHE.versionsAll = versionKeys
return versionKeys
})
}
/**
* Gets full data set from ffbinaries.com
*/
function getVersionData(version) {
if (RUNTIME_CACHE[version]) {
return RUNTIME_CACHE[version]
}
if (version && typeof version !== 'string') {
throw new Error(errorMsgs.incorrectVersionParam)
}
var url = version ? '/version/' + version : '/latest'
return axios.get(`${API_URL}${url}`).then((res) => {
RUNTIME_CACHE[version] = res.data
return res.data
}).catch((error) => {
if (error.response?.status == 404) {
throw new Error(errorMsgs.notFound)
} else {
throw new Error(errorMsgs.connectionIssues)
}
})
}
/**
* Download file(s) and save them in the specified directory
*/
async function downloadUrls(components, urls, opts) {
const destinationDir = opts.destination
const results = []
const remappedUrls = []
if (components && !Array.isArray(components)) {
components = [components]
} else if (!components || !Array.isArray(components)) {
components = []
}
// returns an array of objects like this: {component: 'ffmpeg', url: 'https://...'}
if (typeof urls === 'object') {
for (const key in urls) {
if (components.includes(key) && urls[key]) {
remappedUrls.push({
component: key,
url: urls[key]
})
}
}
}
async function extractZipToDestination(zipFilename) {
const oldpath = path.join(destinationDir, zipFilename)
const zip = new StreamZip.async({ file: oldpath })
const count = await zip.extract(null, destinationDir)
await zip.close()
}
await async.each(remappedUrls, async function (urlObject) {
try {
const url = urlObject.url
const zipFilename = url.split('/').pop()
const binFilenameBase = urlObject.component
const binFilename = getBinaryFilename(binFilenameBase, opts.platform || detectPlatform())
let runningTotal = 0
let totalFilesize
let interval
if (typeof opts.tickerFn === 'function') {
opts.tickerInterval = parseInt(opts.tickerInterval, 10)
const tickerInterval = (!Number.isNaN(opts.tickerInterval)) ? opts.tickerInterval : 1000
const tickData = { filename: zipFilename, progress: 0 }
// Schedule next ticks
interval = setInterval(function () {
if (totalFilesize && runningTotal == totalFilesize) {
return clearInterval(interval)
}
tickData.progress = totalFilesize > -1 ? runningTotal / totalFilesize : 0
opts.tickerFn(tickData)
}, tickerInterval)
}
// Check if file already exists in target directory
const binPath = path.join(destinationDir, binFilename)
if (!opts.force && await fse.pathExists(binPath)) {
// pathExists resolved true, so the binary is already present in the destination directory
results.push({
filename: binFilename,
path: destinationDir,
status: 'File exists',
code: 'FILE_EXISTS'
})
clearInterval(interval)
return
}
if (opts.quiet) clearInterval(interval)
const zipPath = path.join(destinationDir, zipFilename)
const zipFileTempName = zipPath + '.part'
const zipFileFinalName = zipPath
const response = await axios({
url,
method: 'GET',
responseType: 'stream'
})
totalFilesize = response.headers?.['content-length'] || []
const writer = fse.createWriteStream(zipFileTempName)
response.data.on('data', (chunk) => {
runningTotal += chunk.length
})
response.data.pipe(writer)
await finished(writer)
await fse.rename(zipFileTempName, zipFileFinalName)
await extractZipToDestination(zipFilename)
await fse.remove(zipFileFinalName)
results.push({
filename: binFilename,
path: destinationDir,
size: Math.floor(totalFilesize / 1024 / 1024 * 1000) / 1000 + 'MB',
status: 'File extracted to destination (downloaded from "' + url + '")',
code: 'DONE_CLEAN'
})
} catch (err) {
console.error(`Failed to download or extract file for component: ${urlObject.component}`, err)
}
})
return results
}
/**
* Gets binaries for the platform
* It will get the data from ffbinaries.com, pick the correct files
* and save them to the specified directory
*
* @param {Array} components
* @param {Object} [opts]
*/
async function downloadBinaries(components, opts = {}) {
var platform = resolvePlatform(opts.platform) || detectPlatform()
opts.destination = path.resolve(opts.destination || '.')
ensureDirSync(opts.destination)
const versionData = await getVersionData(opts.version)
const urls = versionData?.bin?.[platform]
if (!urls) {
throw new Error('No URLs!')
}
return await downloadUrls(components, urls, opts)
}
module.exports = {
downloadBinaries: downloadBinaries,
getVersionData: getVersionData,
listVersions: listVersions,
listPlatforms: listPlatforms,
detectPlatform: detectPlatform,
resolvePlatform: resolvePlatform,
getBinaryFilename: getBinaryFilename
}
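A minimal usage sketch of this vendored module, assuming it is required from a sibling file; the destination directory is illustrative:
const ffbinaries = require('./ffbinaries')

async function fetchFfmpegBinaries() {
  const platform = ffbinaries.detectPlatform() // e.g. 'linux-64'
  const results = await ffbinaries.downloadBinaries(['ffmpeg', 'ffprobe'], {
    destination: '/tmp/abs-binaries', // hypothetical path
    platform
  })
  // each entry has filename, path, status and a code such as 'DONE_CLEAN' or 'FILE_EXISTS'
  console.log(results)
}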

View File

@@ -0,0 +1,21 @@
MIT License
Copyright (c) 2018 ნიკა
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

View File

@@ -0,0 +1,262 @@
/**
* Modified from https://github.com/nika-begiashvili/libarchivejs
*/
const Path = require('path')
const { Worker } = require('worker_threads')
/**
* Represents compressed file before extraction
*/
class CompressedFile {
constructor(name, size, path, archiveRef) {
this._name = name
this._size = size
this._path = path
this._archiveRef = archiveRef
}
/**
* file name
*/
get name() {
return this._name
}
/**
* file size
*/
get size() {
return this._size
}
/**
* Extract file from archive
* @returns {Promise<File>} extracted file
*/
extract() {
return this._archiveRef.extractSingleFile(this._path)
}
}
class Archive {
/**
* Creates new archive instance from browser native File object
* @param {Buffer} fileBuffer
* @param {object} options
* @returns {Archive}
*/
static open(fileBuffer) {
const arch = new Archive(fileBuffer, { workerUrl: Path.join(__dirname, 'libarchiveWorker.js') })
return arch.open()
}
/**
* Create new archive
* @param {File} file
* @param {Object} options
*/
constructor(file, options) {
this._worker = new Worker(options.workerUrl)
this._worker.on('message', this._workerMsg.bind(this))
this._callbacks = []
this._content = {}
this._processed = 0
this._file = file
}
/**
* Prepares file for reading
* @returns {Promise<Archive>} archive instance
*/
async open() {
await this._postMessage({ type: 'HELLO' }, (resolve, reject, msg) => {
if (msg.type === 'READY') {
resolve()
}
})
return await this._postMessage({ type: 'OPEN', file: this._file }, (resolve, reject, msg) => {
if (msg.type === 'OPENED') {
resolve(this)
}
})
}
/**
* Terminate worker to free up memory
*/
close() {
this._worker.terminate()
this._worker = null
}
/**
* detect if archive has encrypted data
* @returns {boolean|null} null if could not be determined
*/
hasEncryptedData() {
return this._postMessage({ type: 'CHECK_ENCRYPTION' },
(resolve, reject, msg) => {
if (msg.type === 'ENCRYPTION_STATUS') {
resolve(msg.status)
}
}
)
}
/**
* set password to be used when reading archive
*/
usePassword(archivePassword) {
return this._postMessage({ type: 'SET_PASSPHRASE', passphrase: archivePassword },
(resolve, reject, msg) => {
if (msg.type === 'PASSPHRASE_STATUS') {
resolve(msg.status)
}
}
)
}
/**
* Returns object containing directory structure and file information
* @returns {Promise<object>}
*/
getFilesObject() {
if (this._processed > 0) {
return Promise.resolve().then(() => this._content)
}
return this._postMessage({ type: 'LIST_FILES' }, (resolve, reject, msg) => {
if (msg.type === 'ENTRY') {
const entry = msg.entry
const [target, prop] = this._getProp(this._content, entry.path)
if (entry.type === 'FILE') {
target[prop] = new CompressedFile(entry.fileName, entry.size, entry.path, this)
}
return true
} else if (msg.type === 'END') {
this._processed = 1
resolve(this._cloneContent(this._content))
}
})
}
getFilesArray() {
return this.getFilesObject().then((obj) => {
return this._objectToArray(obj)
})
}
extractSingleFile(target) {
// Prevent extraction if worker already terminated
if (this._worker === null) {
throw new Error("Archive already closed")
}
return this._postMessage({ type: 'EXTRACT_SINGLE_FILE', target: target },
(resolve, reject, msg) => {
if (msg.type === 'FILE') {
resolve(msg.entry)
}
}
)
}
/**
* Returns object containing directory structure and extracted File objects
* @param {Function} extractCallback
*
*/
extractFiles(extractCallback) {
if (this._processed > 1) {
return Promise.resolve().then(() => this._content)
}
return this._postMessage({ type: 'EXTRACT_FILES' }, (resolve, reject, msg) => {
if (msg.type === 'ENTRY') {
const [target, prop] = this._getProp(this._content, msg.entry.path)
if (msg.entry.type === 'FILE') {
target[prop] = msg.entry
if (extractCallback !== undefined) {
setTimeout(extractCallback.bind(null, {
file: target[prop],
path: msg.entry.path,
}))
}
}
return true
} else if (msg.type === 'END') {
this._processed = 2
this._worker.terminate()
resolve(this._cloneContent(this._content))
}
})
}
_cloneContent(obj) {
if (obj instanceof CompressedFile || obj === null) return obj
const o = {}
for (const prop of Object.keys(obj)) {
o[prop] = this._cloneContent(obj[prop])
}
return o
}
_objectToArray(obj, path = '') {
const files = []
for (const key of Object.keys(obj)) {
if (obj[key] instanceof CompressedFile || obj[key] === null) {
files.push({
file: obj[key] || key,
path: path
})
} else {
files.push(...this._objectToArray(obj[key], `${path}${key}/`))
}
}
return files
}
_getProp(obj, path) {
const parts = path.split('/')
if (parts[parts.length - 1] === '') parts.pop()
let cur = obj, prev = null
for (const part of parts) {
cur[part] = cur[part] || {}
prev = cur
cur = cur[part]
}
return [prev, parts[parts.length - 1]]
}
_postMessage(msg, callback) {
this._worker.postMessage(msg)
return new Promise((resolve, reject) => {
this._callbacks.push(this._msgHandler.bind(this, callback, resolve, reject))
})
}
_msgHandler(callback, resolve, reject, msg) {
if (!msg) {
reject('invalid msg')
return
}
if (msg.type === 'BUSY') {
reject('worker is busy')
} else if (msg.type === 'ERROR') {
reject(msg.error)
} else {
return callback(resolve, reject, msg)
}
}
_workerMsg(msg) {
const callback = this._callbacks[this._callbacks.length - 1]
const next = callback(msg)
if (!next) {
this._callbacks.pop()
}
}
}
module.exports = Archive
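A sketch of how the wrapper is meant to be consumed (the archive path is hypothetical); Archive.open resolves once the worker thread has loaded the buffer:
const fs = require('fs')
const Archive = require('./libarchive/archive')

async function listArchive(filepath) { // filepath: hypothetical path to a .cbz/.cbr file
  const buffer = await fs.promises.readFile(filepath)
  const archive = await Archive.open(buffer)
  const files = await archive.getFilesArray() // [{ file: CompressedFile, path: 'dir/' }, ...]
  console.log(files.map(f => f.path + f.file.name))
  archive.close() // terminate the worker thread to free memory
}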

View File

@@ -0,0 +1,72 @@
/**
* Modified from https://github.com/nika-begiashvili/libarchivejs
*/
const { parentPort } = require('worker_threads')
const { getArchiveReader } = require('./wasm-module')
let reader = null
let busy = false
getArchiveReader((_reader) => {
reader = _reader
busy = false
parentPort.postMessage({ type: 'READY' })
})
parentPort.on('message', async msg => {
if (busy) {
parentPort.postMessage({ type: 'BUSY' })
return
}
let skipExtraction = false
busy = true
try {
switch (msg.type) {
case 'HELLO': // module will respond READY when it's ready
break
case 'OPEN':
await reader.open(msg.file)
parentPort.postMessage({ type: 'OPENED' })
break
case 'LIST_FILES':
skipExtraction = true
// eslint-disable-next-line no-fallthrough
case 'EXTRACT_FILES':
for (const entry of reader.entries(skipExtraction)) {
parentPort.postMessage({ type: 'ENTRY', entry })
}
parentPort.postMessage({ type: 'END' })
break
case 'EXTRACT_SINGLE_FILE':
for (const entry of reader.entries(true, msg.target)) {
if (entry.fileData) {
parentPort.postMessage({ type: 'FILE', entry })
}
}
break
case 'CHECK_ENCRYPTION':
parentPort.postMessage({ type: 'ENCRYPTION_STATUS', status: reader.hasEncryptedData() })
break
case 'SET_PASSPHRASE':
reader.setPassphrase(msg.passphrase)
parentPort.postMessage({ type: 'PASSPHRASE_STATUS', status: true })
break
default:
throw new Error('Invalid Command')
}
} catch (err) {
parentPort.postMessage({
type: 'ERROR',
error: {
message: err.message,
name: err.name,
stack: err.stack
}
})
} finally {
// eslint-disable-next-line require-atomic-updates
busy = false
}
})

View File

File diff suppressed because one or more lines are too long

View File

@@ -0,0 +1,235 @@
/**
* Modified from https://github.com/nika-begiashvili/libarchivejs
*/
const Path = require('path')
const libarchive = require('./wasm-libarchive')
const TYPE_MAP = {
32768: 'FILE',
16384: 'DIR',
40960: 'SYMBOLIC_LINK',
49152: 'SOCKET',
8192: 'CHARACTER_DEVICE',
24576: 'BLOCK_DEVICE',
4096: 'NAMED_PIPE',
}
class ArchiveReader {
/**
* archive reader
* @param {WasmModule} wasmModule emscripten module
*/
constructor(wasmModule) {
this._wasmModule = wasmModule
this._runCode = wasmModule.runCode
this._file = null
this._passphrase = null
}
/**
* open archive, needs to be closed manually
* @param {File} file
*/
open(file) {
if (this._file !== null) {
console.warn('Closing previous file')
this.close()
}
const { promise, resolve, reject } = this._promiseHandles()
this._file = file
this._loadFile(file, resolve, reject)
return promise
}
/**
* close archive
*/
close() {
this._runCode.closeArchive(this._archive)
this._wasmModule._free(this._filePtr)
this._file = null
this._filePtr = null
this._archive = null
}
/**
* detect if archive has encrypted data
* @returns {boolean|null} null if could not be determined
*/
hasEncryptedData() {
this._archive = this._runCode.openArchive(this._filePtr, this._fileLength, this._passphrase)
this._runCode.getNextEntry(this._archive)
const status = this._runCode.hasEncryptedEntries(this._archive)
if (status === 0) {
return false
} else if (status > 0) {
return true
} else {
return null
}
}
/**
* set passphrase to be used with archive
* @param {*} passphrase
*/
setPassphrase(passphrase) {
this._passphrase = passphrase
}
/**
* get archive entries
* @param {boolean} skipExtraction
* @param {string} except don't skip this entry
*/
*entries(skipExtraction = false, except = null) {
this._archive = this._runCode.openArchive(this._filePtr, this._fileLength, this._passphrase)
let entry
while (true) {
entry = this._runCode.getNextEntry(this._archive)
if (entry === 0) break
const entryData = {
size: this._runCode.getEntrySize(entry),
path: this._runCode.getEntryName(entry),
type: TYPE_MAP[this._runCode.getEntryType(entry)],
ref: entry,
}
if (entryData.type === 'FILE') {
let fileName = entryData.path.split('/')
entryData.fileName = fileName[fileName.length - 1]
}
if (skipExtraction && except !== entryData.path) {
this._runCode.skipEntry(this._archive)
} else {
const ptr = this._runCode.getFileData(this._archive, entryData.size)
if (ptr < 0) {
throw new Error(this._runCode.getError(this._archive))
}
entryData.fileData = this._wasmModule.HEAP8.slice(ptr, ptr + entryData.size)
this._wasmModule._free(ptr)
}
yield entryData
}
}
_loadFile(fileBuffer, resolve, reject) {
try {
const array = new Uint8Array(fileBuffer)
this._fileLength = array.length
this._filePtr = this._runCode.malloc(this._fileLength)
this._wasmModule.HEAP8.set(array, this._filePtr)
resolve()
} catch (error) {
reject(error)
}
}
_promiseHandles() {
let resolve = null, reject = null
const promise = new Promise((_resolve, _reject) => {
resolve = _resolve
reject = _reject
})
return { promise, resolve, reject }
}
}
class WasmModule {
constructor() {
this.preRun = []
this.postRun = []
this.totalDependencies = 0
}
print(...text) {
console.log(text)
}
printErr(...text) {
console.error(text)
}
initFunctions() {
this.runCode = {
// const char * get_version()
getVersion: this.cwrap('get_version', 'string', []),
// void * archive_open( const void * buffer, size_t buffer_size)
// returns archive pointer
openArchive: this.cwrap('archive_open', 'number', ['number', 'number', 'string']),
// void * get_entry(void * archive)
// return archive entry pointer
getNextEntry: this.cwrap('get_next_entry', 'number', ['number']),
// void * get_filedata( void * archive, size_t bufferSize )
getFileData: this.cwrap('get_filedata', 'number', ['number', 'number']),
// int archive_read_data_skip(struct archive *_a)
skipEntry: this.cwrap('archive_read_data_skip', 'number', ['number']),
// void archive_close( void * archive )
closeArchive: this.cwrap('archive_close', null, ['number']),
// la_int64_t archive_entry_size( struct archive_entry * )
getEntrySize: this.cwrap('archive_entry_size', 'number', ['number']),
// const char * archive_entry_pathname( struct archive_entry * )
getEntryName: this.cwrap('archive_entry_pathname', 'string', ['number']),
// __LA_MODE_T archive_entry_filetype( struct archive_entry * )
/*
#define AE_IFMT ((__LA_MODE_T)0170000)
#define AE_IFREG ((__LA_MODE_T)0100000) // Regular file
#define AE_IFLNK ((__LA_MODE_T)0120000) // Symbolic link
#define AE_IFSOCK ((__LA_MODE_T)0140000) // Socket
#define AE_IFCHR ((__LA_MODE_T)0020000) // Character device
#define AE_IFBLK ((__LA_MODE_T)0060000) // Block device
#define AE_IFDIR ((__LA_MODE_T)0040000) // Directory
#define AE_IFIFO ((__LA_MODE_T)0010000) // Named pipe
*/
getEntryType: this.cwrap('archive_entry_filetype', 'number', ['number']),
// const char * archive_error_string(struct archive *);
getError: this.cwrap('archive_error_string', 'string', ['number']),
/*
* Returns 1 if the archive contains at least one encrypted entry.
* If the archive format does not support encryption at all
* ARCHIVE_READ_FORMAT_ENCRYPTION_UNSUPPORTED is returned.
* If for any other reason (e.g. not enough data read so far)
* we cannot say whether there are encrypted entries, then
* ARCHIVE_READ_FORMAT_ENCRYPTION_DONT_KNOW is returned.
* In general, this function will return values below zero when the
* reader is uncertain or totally incapable of encryption support.
* When this function returns 0 you can be sure that the reader
* supports encryption detection but no encrypted entries have
* been found yet.
*
* NOTE: If the metadata/header of an archive is also encrypted, you
* cannot rely on the number of encrypted entries. That is why this
* function does not return the number of encrypted entries but
* just shows that there are some.
*/
// __LA_DECL int archive_read_has_encrypted_entries(struct archive *);
entryIsEncrypted: this.cwrap('archive_entry_is_encrypted', 'number', ['number']),
hasEncryptedEntries: this.cwrap('archive_read_has_encrypted_entries', 'number', ['number']),
// __LA_DECL int archive_read_add_passphrase(struct archive *, const char *);
addPassphrase: this.cwrap('archive_read_add_passphrase', 'number', ['number', 'string']),
//this.stringToUTF(str), //
string: (str) => this.allocate(this.intArrayFromString(str), 'i8', 0),
malloc: this.cwrap('malloc', 'number', ['number']),
free: this.cwrap('free', null, ['number']),
}
}
monitorRunDependencies() { }
locateFile(path /* ,prefix */) {
const wasmFilepath = Path.join(__dirname, `../../../client/dist/libarchive/wasm-gen/${path}`)
return wasmFilepath
}
}
module.exports.getArchiveReader = (cb) => {
libarchive(new WasmModule()).then((module) => {
module.initFunctions()
cb(new ArchiveReader(module))
})
}
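Illustrative only: this is roughly how libarchiveWorker.js above obtains and drives the reader; fileBuffer is a hypothetical Buffer holding the archive bytes:
const { getArchiveReader } = require('./wasm-module')

function listEntries(fileBuffer) {
  getArchiveReader((reader) => {
    reader.open(fileBuffer).then(() => {
      // skipExtraction = true lists entries without copying file data out of WASM memory
      for (const entry of reader.entries(true)) {
        console.log(entry.type, entry.path, entry.size)
      }
      reader.close()
    })
  })
}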

View File

@@ -0,0 +1,74 @@
const path = require('path')
const which = require('../libs/which')
const fs = require('../libs/fsExtra')
const ffbinaries = require('../libs/ffbinaries')
const Logger = require('../Logger')
const fileUtils = require('../utils/fileUtils')
class BinaryManager {
defaultRequiredBinaries = [
{ name: 'ffmpeg', envVariable: 'FFMPEG_PATH' },
{ name: 'ffprobe', envVariable: 'FFPROBE_PATH' }
]
constructor(requiredBinaries = this.defaultRequiredBinaries) {
this.requiredBinaries = requiredBinaries
this.mainInstallPath = process.pkg ? path.dirname(process.execPath) : global.appRoot
this.altInstallPath = global.ConfigPath
}
async init() {
if (this.initialized) return
const missingBinaries = await this.findRequiredBinaries()
if (missingBinaries.length == 0) return
await this.install(missingBinaries)
const missingBinariesAfterInstall = await this.findRequiredBinaries()
if (missingBinariesAfterInstall.length != 0) {
Logger.error(`[BinaryManager] Failed to find or install required binaries: ${missingBinariesAfterInstall.join(', ')}`)
process.exit(1)
}
this.initialized = true
}
async findRequiredBinaries() {
const missingBinaries = []
for (const binary of this.requiredBinaries) {
const binaryPath = await this.findBinary(binary.name, binary.envVariable)
if (binaryPath) {
Logger.info(`[BinaryManager] Found ${binary.name} at ${binaryPath}`)
if (process.env[binary.envVariable] !== binaryPath) {
Logger.info(`[BinaryManager] Updating process.env.${binary.envVariable}`)
process.env[binary.envVariable] = binaryPath
}
} else {
Logger.info(`[BinaryManager] ${binary.name} not found`)
missingBinaries.push(binary.name)
}
}
return missingBinaries
}
async findBinary(name, envVariable) {
const executable = name + (process.platform == 'win32' ? '.exe' : '')
const defaultPath = process.env[envVariable]
if (defaultPath && await fs.pathExists(defaultPath)) return defaultPath
const whichPath = which.sync(executable, { nothrow: true })
if (whichPath) return whichPath
const mainInstallPath = path.join(this.mainInstallPath, executable)
if (await fs.pathExists(mainInstallPath)) return mainInstallPath
const altInstallPath = path.join(this.altInstallPath, executable)
if (await fs.pathExists(altInstallPath)) return altInstallPath
return null
}
async install(binaries) {
if (binaries.length == 0) return
Logger.info(`[BinaryManager] Installing binaries: ${binaries.join(', ')}`)
let destination = await fileUtils.isWritable(this.mainInstallPath) ? this.mainInstallPath : this.altInstallPath
await ffbinaries.downloadBinaries(binaries, { destination })
Logger.info(`[BinaryManager] Binaries installed to ${destination}`)
}
}
module.exports = BinaryManager
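Sketch of the expected startup wiring (the call site is not part of this diff):
const BinaryManager = require('./managers/BinaryManager')

async function ensureFfmpeg() {
  const binaryManager = new BinaryManager()
  // finds ffmpeg/ffprobe on PATH or in the install locations, downloads them if missing,
  // and points FFMPEG_PATH/FFPROBE_PATH at the resolved binaries
  await binaryManager.init()
}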

View File

@@ -7,6 +7,8 @@ const imageType = require('../libs/imageType')
const globals = require('../utils/globals')
const { downloadImageFile, filePathToPOSIX, checkPathIsFile } = require('../utils/fileUtils')
const { extractCoverArt } = require('../utils/ffmpegHelpers')
const parseEbookMetadata = require('../utils/parsers/parseEbookMetadata')
const CacheManager = require('../managers/CacheManager')
class CoverManager {
@@ -234,6 +236,7 @@ class CoverManager {
/**
* Extract cover art from audio file and save for library item
*
* @param {import('../models/Book').AudioFileObject[]} audioFiles
* @param {string} libraryItemId
* @param {string} [libraryItemPath] null for isFile library items
@@ -268,6 +271,44 @@ class CoverManager {
return null
}
/**
* Extract cover art from ebook and save for library item
*
* @param {import('../utils/parsers/parseEbookMetadata').EBookFileScanData} ebookFileScanData
* @param {string} libraryItemId
* @param {string} [libraryItemPath] null for isFile library items
* @returns {Promise<string>} returns cover path
*/
async saveEbookCoverArt(ebookFileScanData, libraryItemId, libraryItemPath) {
if (!ebookFileScanData?.ebookCoverPath) return null
let coverDirPath = null
if (global.ServerSettings.storeCoverWithItem && libraryItemPath) {
coverDirPath = libraryItemPath
} else {
coverDirPath = Path.posix.join(global.MetadataPath, 'items', libraryItemId)
}
await fs.ensureDir(coverDirPath)
let extname = Path.extname(ebookFileScanData.ebookCoverPath) || '.jpg'
if (extname === '.jpeg') extname = '.jpg'
const coverFilename = `cover${extname}`
const coverFilePath = Path.join(coverDirPath, coverFilename)
// TODO: Overwrite if exists?
const coverAlreadyExists = await fs.pathExists(coverFilePath)
if (coverAlreadyExists) {
Logger.warn(`[CoverManager] Extract embedded cover art but cover already exists for "${coverFilePath}" - overwriting`)
}
const success = await parseEbookMetadata.extractCoverImage(ebookFileScanData, coverFilePath)
if (success) {
await CacheManager.purgeCoverCache(libraryItemId)
return coverFilePath
}
return null
}
/**
*
* @param {string} url

View File

@@ -18,6 +18,19 @@ const Logger = require('../Logger')
* @property {string} title
*/
/**
* @typedef SeriesExpandedProperties
* @property {{sequence:string}} bookSeries
*
* @typedef {import('./Series') & SeriesExpandedProperties} SeriesExpanded
*
* @typedef BookExpandedProperties
* @property {import('./Author')[]} authors
* @property {SeriesExpanded[]} series
*
* @typedef {Book & BookExpandedProperties} BookExpanded
*/
/**
* @typedef AudioFileObject
* @property {number} index
@@ -54,6 +67,8 @@ class Book extends Model {
/** @type {string} */
this.titleIgnorePrefix
/** @type {string} */
this.subtitle
/** @type {string} */
this.publishedYear
/** @type {string} */
this.publishedDate

View File

@@ -233,7 +233,7 @@ class Library extends Model {
for (let i = 0; i < libraries.length; i++) {
const library = libraries[i]
if (library.displayOrder !== i + 1) {
Logger.dev(`[Library] Updating display order of library from ${library.displayOrder} to ${i + 1}`)
Logger.debug(`[Library] Updating display order of library from ${library.displayOrder} to ${i + 1}`)
await library.update({ displayOrder: i + 1 }).catch((error) => {
Logger.error(`[Library] Failed to update library display order to ${i + 1}`, error)
})

View File

@@ -15,6 +15,13 @@ const Podcast = require('./Podcast')
* @property {{filename:string, ext:string, path:string, relPath:string, size:number, mtimeMs:number, ctimeMs:number, birthtimeMs:number}} metadata
*/
/**
* @typedef LibraryItemExpandedProperties
* @property {Book.BookExpanded|Podcast.PodcastExpanded} media
*
* @typedef {LibraryItem & LibraryItemExpandedProperties} LibraryItemExpanded
*/
class LibraryItem extends Model {
constructor(values, options) {
super(values, options)
@@ -264,7 +271,7 @@ class LibraryItem extends Model {
for (const existingPodcastEpisode of existingPodcastEpisodes) {
// Episode was removed
if (!updatedPodcastEpisodes.some(ep => ep.id === existingPodcastEpisode.id)) {
Logger.dev(`[LibraryItem] "${libraryItemExpanded.media.title}" episode "${existingPodcastEpisode.title}" was removed`)
Logger.debug(`[LibraryItem] "${libraryItemExpanded.media.title}" episode "${existingPodcastEpisode.title}" was removed`)
await existingPodcastEpisode.destroy()
hasUpdates = true
}
@@ -272,7 +279,7 @@ class LibraryItem extends Model {
for (const updatedPodcastEpisode of updatedPodcastEpisodes) {
const existingEpisodeMatch = existingPodcastEpisodes.find(ep => ep.id === updatedPodcastEpisode.id)
if (!existingEpisodeMatch) {
Logger.dev(`[LibraryItem] "${libraryItemExpanded.media.title}" episode "${updatedPodcastEpisode.title}" was added`)
Logger.debug(`[LibraryItem] "${libraryItemExpanded.media.title}" episode "${updatedPodcastEpisode.title}" was added`)
await this.sequelize.models.podcastEpisode.createFromOld(updatedPodcastEpisode)
hasUpdates = true
} else {
@@ -283,7 +290,7 @@ class LibraryItem extends Model {
if (existingValue instanceof Date) existingValue = existingValue.valueOf()
if (!areEquivalent(updatedEpisodeCleaned[key], existingValue, true)) {
Logger.dev(`[LibraryItem] "${libraryItemExpanded.media.title}" episode "${existingEpisodeMatch.title}" ${key} was updated from "${existingValue}" to "${updatedEpisodeCleaned[key]}"`)
Logger.debug(`[LibraryItem] "${libraryItemExpanded.media.title}" episode "${existingEpisodeMatch.title}" ${key} was updated from "${existingValue}" to "${updatedEpisodeCleaned[key]}"`)
episodeHasUpdates = true
}
}
@@ -304,7 +311,7 @@ class LibraryItem extends Model {
for (const existingAuthor of existingAuthors) {
// Author was removed from Book
if (!updatedAuthors.some(au => au.id === existingAuthor.id)) {
Logger.dev(`[LibraryItem] "${libraryItemExpanded.media.title}" author "${existingAuthor.name}" was removed`)
Logger.debug(`[LibraryItem] "${libraryItemExpanded.media.title}" author "${existingAuthor.name}" was removed`)
await this.sequelize.models.bookAuthor.removeByIds(existingAuthor.id, libraryItemExpanded.media.id)
hasUpdates = true
}
@@ -312,7 +319,7 @@ class LibraryItem extends Model {
for (const updatedAuthor of updatedAuthors) {
// Author was added
if (!existingAuthors.some(au => au.id === updatedAuthor.id)) {
Logger.dev(`[LibraryItem] "${libraryItemExpanded.media.title}" author "${updatedAuthor.name}" was added`)
Logger.debug(`[LibraryItem] "${libraryItemExpanded.media.title}" author "${updatedAuthor.name}" was added`)
await this.sequelize.models.bookAuthor.create({ authorId: updatedAuthor.id, bookId: libraryItemExpanded.media.id })
hasUpdates = true
}
@@ -320,7 +327,7 @@ class LibraryItem extends Model {
for (const existingSeries of existingSeriesAll) {
// Series was removed
if (!updatedSeriesAll.some(se => se.id === existingSeries.id)) {
Logger.dev(`[LibraryItem] "${libraryItemExpanded.media.title}" series "${existingSeries.name}" was removed`)
Logger.debug(`[LibraryItem] "${libraryItemExpanded.media.title}" series "${existingSeries.name}" was removed`)
await this.sequelize.models.bookSeries.removeByIds(existingSeries.id, libraryItemExpanded.media.id)
hasUpdates = true
}
@@ -329,11 +336,11 @@ class LibraryItem extends Model {
// Series was added/updated
const existingSeriesMatch = existingSeriesAll.find(se => se.id === updatedSeries.id)
if (!existingSeriesMatch) {
Logger.dev(`[LibraryItem] "${libraryItemExpanded.media.title}" series "${updatedSeries.name}" was added`)
Logger.debug(`[LibraryItem] "${libraryItemExpanded.media.title}" series "${updatedSeries.name}" was added`)
await this.sequelize.models.bookSeries.create({ seriesId: updatedSeries.id, bookId: libraryItemExpanded.media.id, sequence: updatedSeries.sequence })
hasUpdates = true
} else if (existingSeriesMatch.bookSeries.sequence !== updatedSeries.sequence) {
Logger.dev(`[LibraryItem] "${libraryItemExpanded.media.title}" series "${updatedSeries.name}" sequence was updated from "${existingSeriesMatch.bookSeries.sequence}" to "${updatedSeries.sequence}"`)
Logger.debug(`[LibraryItem] "${libraryItemExpanded.media.title}" series "${updatedSeries.name}" sequence was updated from "${existingSeriesMatch.bookSeries.sequence}" to "${updatedSeries.sequence}"`)
await existingSeriesMatch.bookSeries.update({ id: updatedSeries.id, sequence: updatedSeries.sequence })
hasUpdates = true
}
@@ -346,7 +353,7 @@ class LibraryItem extends Model {
if (existingValue instanceof Date) existingValue = existingValue.valueOf()
if (!areEquivalent(updatedMedia[key], existingValue, true)) {
Logger.dev(`[LibraryItem] "${libraryItemExpanded.media.title}" ${libraryItemExpanded.mediaType}.${key} updated from ${existingValue} to ${updatedMedia[key]}`)
Logger.debug(`[LibraryItem] "${libraryItemExpanded.media.title}" ${libraryItemExpanded.mediaType}.${key} updated from ${existingValue} to ${updatedMedia[key]}`)
hasMediaUpdates = true
}
}
@@ -363,7 +370,7 @@ class LibraryItem extends Model {
if (existingValue instanceof Date) existingValue = existingValue.valueOf()
if (!areEquivalent(updatedLibraryItem[key], existingValue, true)) {
Logger.dev(`[LibraryItem] "${libraryItemExpanded.media.title}" ${key} updated from ${existingValue} to ${updatedLibraryItem[key]}`)
Logger.debug(`[LibraryItem] "${libraryItemExpanded.media.title}" ${key} updated from ${existingValue} to ${updatedLibraryItem[key]}`)
hasLibraryItemUpdates = true
}
}
@@ -412,6 +419,55 @@ class LibraryItem extends Model {
})
}
/**
*
* @param {string} libraryItemId
* @returns {Promise<LibraryItemExpanded>}
*/
static async getExpandedById(libraryItemId) {
if (!libraryItemId) return null
const libraryItem = await this.findByPk(libraryItemId)
if (!libraryItem) {
Logger.error(`[LibraryItem] Library item not found with id "${libraryItemId}"`)
return null
}
if (libraryItem.mediaType === 'podcast') {
libraryItem.media = await libraryItem.getMedia({
include: [
{
model: this.sequelize.models.podcastEpisode
}
]
})
} else {
libraryItem.media = await libraryItem.getMedia({
include: [
{
model: this.sequelize.models.author,
through: {
attributes: []
}
},
{
model: this.sequelize.models.series,
through: {
attributes: ['sequence']
}
}
],
order: [
[this.sequelize.models.author, this.sequelize.models.bookAuthor, 'createdAt', 'ASC'],
[this.sequelize.models.series, 'bookSeries', 'createdAt', 'ASC']
]
})
}
if (!libraryItem.media) return null
return libraryItem
}
/**
* Get old library item by id
* @param {string} libraryItemId
@@ -419,40 +475,45 @@ class LibraryItem extends Model {
*/
static async getOldById(libraryItemId) {
if (!libraryItemId) return null
const libraryItem = await this.findByPk(libraryItemId, {
include: [
{
model: this.sequelize.models.book,
include: [
{
model: this.sequelize.models.author,
through: {
attributes: []
}
},
{
model: this.sequelize.models.series,
through: {
attributes: ['sequence']
}
const libraryItem = await this.findByPk(libraryItemId)
if (!libraryItem) {
Logger.error(`[LibraryItem] Library item not found with id "${libraryItemId}"`)
return null
}
if (libraryItem.mediaType === 'podcast') {
libraryItem.media = await libraryItem.getMedia({
include: [
{
model: this.sequelize.models.podcastEpisode
}
]
})
} else {
libraryItem.media = await libraryItem.getMedia({
include: [
{
model: this.sequelize.models.author,
through: {
attributes: []
}
]
},
{
model: this.sequelize.models.podcast,
include: [
{
model: this.sequelize.models.podcastEpisode
},
{
model: this.sequelize.models.series,
through: {
attributes: ['sequence']
}
]
}
],
order: [
[this.sequelize.models.book, this.sequelize.models.author, this.sequelize.models.bookAuthor, 'createdAt', 'ASC'],
[this.sequelize.models.book, this.sequelize.models.series, 'bookSeries', 'createdAt', 'ASC']
]
})
if (!libraryItem) return null
}
],
order: [
[this.sequelize.models.author, this.sequelize.models.bookAuthor, 'createdAt', 'ASC'],
[this.sequelize.models.series, 'bookSeries', 'createdAt', 'ASC']
]
})
}
if (!libraryItem.media) return null
return this.getOldLibraryItem(libraryItem)
}
@@ -536,7 +597,7 @@ class LibraryItem extends Model {
})
}
}
Logger.dev(`Loaded ${itemsInProgressPayload.items.length} of ${itemsInProgressPayload.count} items for "Continue Listening/Reading" in ${((Date.now() - fullStart) / 1000).toFixed(2)}s`)
Logger.debug(`Loaded ${itemsInProgressPayload.items.length} of ${itemsInProgressPayload.count} items for "Continue Listening/Reading" in ${((Date.now() - fullStart) / 1000).toFixed(2)}s`)
let start = Date.now()
if (library.isBook) {
@@ -553,7 +614,7 @@ class LibraryItem extends Model {
total: continueSeriesPayload.count
})
}
Logger.dev(`Loaded ${continueSeriesPayload.libraryItems.length} of ${continueSeriesPayload.count} items for "Continue Series" in ${((Date.now() - start) / 1000).toFixed(2)}s`)
Logger.debug(`Loaded ${continueSeriesPayload.libraryItems.length} of ${continueSeriesPayload.count} items for "Continue Series" in ${((Date.now() - start) / 1000).toFixed(2)}s`)
} else if (library.isPodcast) {
// "Newest Episodes" shelf
const newestEpisodesPayload = await libraryFilters.getNewestPodcastEpisodes(library, user, limit)
@@ -567,7 +628,7 @@ class LibraryItem extends Model {
total: newestEpisodesPayload.count
})
}
Logger.dev(`Loaded ${newestEpisodesPayload.libraryItems.length} of ${newestEpisodesPayload.count} episodes for "Newest Episodes" in ${((Date.now() - start) / 1000).toFixed(2)}s`)
Logger.debug(`Loaded ${newestEpisodesPayload.libraryItems.length} of ${newestEpisodesPayload.count} episodes for "Newest Episodes" in ${((Date.now() - start) / 1000).toFixed(2)}s`)
}
start = Date.now()
@@ -583,7 +644,7 @@ class LibraryItem extends Model {
total: mostRecentPayload.count
})
}
Logger.dev(`Loaded ${mostRecentPayload.libraryItems.length} of ${mostRecentPayload.count} items for "Recently Added" in ${((Date.now() - start) / 1000).toFixed(2)}s`)
Logger.debug(`Loaded ${mostRecentPayload.libraryItems.length} of ${mostRecentPayload.count} items for "Recently Added" in ${((Date.now() - start) / 1000).toFixed(2)}s`)
if (library.isBook) {
start = Date.now()
@@ -599,7 +660,7 @@ class LibraryItem extends Model {
total: seriesMostRecentPayload.count
})
}
Logger.dev(`Loaded ${seriesMostRecentPayload.series.length} of ${seriesMostRecentPayload.count} series for "Recent Series" in ${((Date.now() - start) / 1000).toFixed(2)}s`)
Logger.debug(`Loaded ${seriesMostRecentPayload.series.length} of ${seriesMostRecentPayload.count} series for "Recent Series" in ${((Date.now() - start) / 1000).toFixed(2)}s`)
start = Date.now()
// "Discover" shelf
@@ -614,7 +675,7 @@ class LibraryItem extends Model {
total: discoverLibraryItemsPayload.count
})
}
Logger.dev(`Loaded ${discoverLibraryItemsPayload.libraryItems.length} of ${discoverLibraryItemsPayload.count} items for "Discover" in ${((Date.now() - start) / 1000).toFixed(2)}s`)
Logger.debug(`Loaded ${discoverLibraryItemsPayload.libraryItems.length} of ${discoverLibraryItemsPayload.count} items for "Discover" in ${((Date.now() - start) / 1000).toFixed(2)}s`)
}
start = Date.now()
@@ -645,7 +706,7 @@ class LibraryItem extends Model {
})
}
}
Logger.dev(`Loaded ${mediaFinishedPayload.items.length} of ${mediaFinishedPayload.count} items for "Listen/Read Again" in ${((Date.now() - start) / 1000).toFixed(2)}s`)
Logger.debug(`Loaded ${mediaFinishedPayload.items.length} of ${mediaFinishedPayload.count} items for "Listen/Read Again" in ${((Date.now() - start) / 1000).toFixed(2)}s`)
if (library.isBook) {
start = Date.now()
@@ -661,7 +722,7 @@ class LibraryItem extends Model {
total: newestAuthorsPayload.count
})
}
Logger.dev(`Loaded ${newestAuthorsPayload.authors.length} of ${newestAuthorsPayload.count} authors for "Newest Authors" in ${((Date.now() - start) / 1000).toFixed(2)}s`)
Logger.debug(`Loaded ${newestAuthorsPayload.authors.length} of ${newestAuthorsPayload.count} authors for "Newest Authors" in ${((Date.now() - start) / 1000).toFixed(2)}s`)
}
Logger.debug(`Loaded ${shelves.length} personalized shelves in ${((Date.now() - fullStart) / 1000).toFixed(2)}s`)

View File

@@ -1,5 +1,12 @@
const { DataTypes, Model } = require('sequelize')
/**
* @typedef PodcastExpandedProperties
* @property {import('./PodcastEpisode')[]} podcastEpisodes
*
* @typedef {Podcast & PodcastExpandedProperties} PodcastExpanded
*/
class Podcast extends Model {
constructor(values, options) {
super(values, options)

View File

@@ -152,7 +152,12 @@ class PodcastEpisode extends Model {
extraData: DataTypes.JSON
}, {
sequelize,
modelName: 'podcastEpisode'
modelName: 'podcastEpisode',
indexes: [
{
fields: ['createdAt']
}
]
})
const { podcast } = sequelize.models

View File

@@ -48,12 +48,14 @@ class PodcastEpisode {
this.guid = episode.guid || null
this.pubDate = episode.pubDate
this.chapters = episode.chapters?.map(ch => ({ ...ch })) || []
this.audioFile = new AudioFile(episode.audioFile)
this.audioFile = episode.audioFile ? new AudioFile(episode.audioFile) : null
this.publishedAt = episode.publishedAt
this.addedAt = episode.addedAt
this.updatedAt = episode.updatedAt
this.audioFile.index = 1 // Only 1 audio file per episode
if (this.audioFile) {
this.audioFile.index = 1 // Only 1 audio file per episode
}
}
toJSON() {
@@ -73,7 +75,7 @@ class PodcastEpisode {
guid: this.guid,
pubDate: this.pubDate,
chapters: this.chapters.map(ch => ({ ...ch })),
audioFile: this.audioFile.toJSON(),
audioFile: this.audioFile?.toJSON() || null,
publishedAt: this.publishedAt,
addedAt: this.addedAt,
updatedAt: this.updatedAt
@@ -97,8 +99,8 @@ class PodcastEpisode {
guid: this.guid,
pubDate: this.pubDate,
chapters: this.chapters.map(ch => ({ ...ch })),
audioFile: this.audioFile.toJSON(),
audioTrack: this.audioTrack.toJSON(),
audioFile: this.audioFile?.toJSON() || null,
audioTrack: this.audioTrack?.toJSON() || null,
publishedAt: this.publishedAt,
addedAt: this.addedAt,
updatedAt: this.updatedAt,
@@ -108,6 +110,7 @@ class PodcastEpisode {
}
get audioTrack() {
if (!this.audioFile) return null
const audioTrack = new AudioTrack()
audioTrack.setData(this.libraryItemId, this.audioFile, 0)
return audioTrack
@@ -116,9 +119,9 @@ class PodcastEpisode {
return [this.audioTrack]
}
get duration() {
return this.audioFile.duration
return this.audioFile?.duration || 0
}
get size() { return this.audioFile.metadata.size }
get size() { return this.audioFile?.metadata.size || 0 }
get enclosureUrl() {
return this.enclosure?.url || null
}

View File

@@ -320,35 +320,6 @@ class ApiRouter {
this.router.get('/stats/year/:year', MiscController.getAdminStatsForYear.bind(this))
}
async getDirectories(dir, relpath, excludedDirs, level = 0) {
try {
const paths = await fs.readdir(dir)
let dirs = await Promise.all(paths.map(async dirname => {
const fullPath = Path.join(dir, dirname)
const path = Path.join(relpath, dirname)
const isDir = (await fs.lstat(fullPath)).isDirectory()
if (isDir && !excludedDirs.includes(path) && dirname !== 'node_modules') {
return {
path,
dirname,
fullPath,
level,
dirs: level < 4 ? (await this.getDirectories(fullPath, path, excludedDirs, level + 1)) : []
}
} else {
return false
}
}))
dirs = dirs.filter(d => d)
return dirs
} catch (error) {
Logger.error('Failed to readdir', dir, error)
return []
}
}
//
// Helper Methods
//

View File

@@ -36,6 +36,8 @@ class AbsMetadataFileScanner {
for (const key in abMetadata) {
// TODO: When to override with null or empty arrays?
if (abMetadata[key] === undefined || abMetadata[key] === null) continue
if (key === 'authors' && !abMetadata.authors?.length) continue
if (key === 'genres' && !abMetadata.genres?.length) continue
if (key === 'tags' && !abMetadata.tags?.length) continue
if (key === 'chapters' && !abMetadata.chapters?.length) continue

View File

@@ -468,7 +468,7 @@ class AudioFileScanner {
audioFiles.length === 1 ||
audioFiles.length > 1 &&
audioFiles[0].chapters.length === audioFiles[1].chapters?.length &&
audioFiles[0].chapters.every((c, i) => c.title === audioFiles[1].chapters[i].title)
audioFiles[0].chapters.every((c, i) => c.title === audioFiles[1].chapters[i].title && c.start === audioFiles[1].chapters[i].start)
) {
libraryScan.addLog(LogLevel.DEBUG, `setChapters: Using embedded chapters in first audio file ${audioFiles[0].metadata?.path}`)
chapters = audioFiles[0].chapters.map((c) => ({ ...c }))

View File

@@ -3,8 +3,8 @@ const Path = require('path')
const sequelize = require('sequelize')
const { LogLevel } = require('../utils/constants')
const { getTitleIgnorePrefix, areEquivalent } = require('../utils/index')
const abmetadataGenerator = require('../utils/generators/abmetadataGenerator')
const parseNameString = require('../utils/parsers/parseNameString')
const parseEbookMetadata = require('../utils/parsers/parseEbookMetadata')
const globals = require('../utils/globals')
const AudioFileScanner = require('./AudioFileScanner')
const Database = require('../Database')
@@ -170,7 +170,9 @@ class BookScanner {
hasMediaChanges = true
}
const bookMetadata = await this.getBookMetadataFromScanData(media.audioFiles, libraryItemData, libraryScan, librarySettings, existingLibraryItem.id)
const ebookFileScanData = await parseEbookMetadata.parse(media.ebookFile)
const bookMetadata = await this.getBookMetadataFromScanData(media.audioFiles, ebookFileScanData, libraryItemData, libraryScan, librarySettings, existingLibraryItem.id)
let authorsUpdated = false
const bookAuthorsRemoved = []
let seriesUpdated = false
@@ -217,7 +219,8 @@ class BookScanner {
} else if (key === 'series') {
// Check for series added
for (const seriesObj of bookMetadata.series) {
if (!media.series.some(se => se.name === seriesObj.name)) {
const existingBookSeries = media.series.find(se => se.name === seriesObj.name)
if (!existingBookSeries) {
const existingSeries = Database.libraryFilterData[libraryItemData.libraryId].series.find(se => se.name === seriesObj.name)
if (existingSeries) {
await Database.bookSeriesModel.create({
@@ -238,6 +241,11 @@ class BookScanner {
libraryScan.addLog(LogLevel.DEBUG, `Updating book "${bookMetadata.title}" added new series "${seriesObj.name}"${seriesObj.sequence ? ` with sequence "${seriesObj.sequence}"` : ''}`)
seriesUpdated = true
}
} else if (seriesObj.sequence && existingBookSeries.bookSeries.sequence !== seriesObj.sequence) {
libraryScan.addLog(LogLevel.DEBUG, `Updating book "${bookMetadata.title}" series "${seriesObj.name}" sequence "${existingBookSeries.bookSeries.sequence || ''}" => "${seriesObj.sequence}"`)
seriesUpdated = true
existingBookSeries.bookSeries.sequence = seriesObj.sequence
await existingBookSeries.bookSeries.save()
}
}
// Check for series removed
@@ -311,24 +319,34 @@ class BookScanner {
})
}
// If no cover then extract cover from audio file if available OR search for cover if enabled in server settings
// If no cover then extract cover from audio file OR from ebook
const libraryItemDir = existingLibraryItem.isFile ? null : existingLibraryItem.path
if (!media.coverPath) {
const libraryItemDir = existingLibraryItem.isFile ? null : existingLibraryItem.path
const extractedCoverPath = await CoverManager.saveEmbeddedCoverArt(media.audioFiles, existingLibraryItem.id, libraryItemDir)
let extractedCoverPath = await CoverManager.saveEmbeddedCoverArt(media.audioFiles, existingLibraryItem.id, libraryItemDir)
if (extractedCoverPath) {
libraryScan.addLog(LogLevel.DEBUG, `Updating book "${bookMetadata.title}" extracted embedded cover art from audio file to path "${extractedCoverPath}"`)
media.coverPath = extractedCoverPath
hasMediaChanges = true
} else if (Database.serverSettings.scannerFindCovers) {
const authorName = media.authors.map(au => au.name).filter(au => au).join(', ')
const coverPath = await this.searchForCover(existingLibraryItem.id, libraryItemDir, media.title, authorName, libraryScan)
if (coverPath) {
media.coverPath = coverPath
} else if (ebookFileScanData?.ebookCoverPath) {
extractedCoverPath = await CoverManager.saveEbookCoverArt(ebookFileScanData, existingLibraryItem.id, libraryItemDir)
if (extractedCoverPath) {
libraryScan.addLog(LogLevel.DEBUG, `Updating book "${bookMetadata.title}" extracted embedded cover art from ebook file to path "${extractedCoverPath}"`)
media.coverPath = extractedCoverPath
hasMediaChanges = true
}
}
}
// If no cover then search for cover if enabled in server settings
if (!media.coverPath && Database.serverSettings.scannerFindCovers) {
const authorName = media.authors.map(au => au.name).filter(au => au).join(', ')
const coverPath = await this.searchForCover(existingLibraryItem.id, libraryItemDir, media.title, authorName, libraryScan)
if (coverPath) {
media.coverPath = coverPath
hasMediaChanges = true
}
}
existingLibraryItem.media = media
let libraryItemUpdated = false
@@ -402,12 +420,14 @@ class BookScanner {
return null
}
let ebookFileScanData = null
if (ebookLibraryFile) {
ebookLibraryFile = ebookLibraryFile.toJSON()
ebookLibraryFile.ebookFormat = ebookLibraryFile.metadata.ext.slice(1).toLowerCase()
ebookFileScanData = await parseEbookMetadata.parse(ebookLibraryFile)
}
const bookMetadata = await this.getBookMetadataFromScanData(scannedAudioFiles, libraryItemData, libraryScan, librarySettings)
const bookMetadata = await this.getBookMetadataFromScanData(scannedAudioFiles, ebookFileScanData, libraryItemData, libraryScan, librarySettings)
bookMetadata.explicit = !!bookMetadata.explicit // Ensure boolean
bookMetadata.abridged = !!bookMetadata.abridged // Ensure boolean
@@ -475,19 +495,28 @@ class BookScanner {
}
}
// If cover was not found in folder then check embedded covers in audio files OR search for cover
// If cover was not found in folder then check embedded covers in audio files OR ebook file
const libraryItemDir = libraryItemObj.isFile ? null : libraryItemObj.path
if (!bookObject.coverPath) {
const libraryItemDir = libraryItemObj.isFile ? null : libraryItemObj.path
// Extract and save embedded cover art
const extractedCoverPath = await CoverManager.saveEmbeddedCoverArt(scannedAudioFiles, libraryItemObj.id, libraryItemDir)
let extractedCoverPath = await CoverManager.saveEmbeddedCoverArt(scannedAudioFiles, libraryItemObj.id, libraryItemDir)
if (extractedCoverPath) {
libraryScan.addLog(LogLevel.DEBUG, `Extracted embedded cover from audio file at "${extractedCoverPath}" for book "${bookObject.title}"`)
bookObject.coverPath = extractedCoverPath
} else if (Database.serverSettings.scannerFindCovers) {
const authorName = bookMetadata.authors.join(', ')
bookObject.coverPath = await this.searchForCover(libraryItemObj.id, libraryItemDir, bookObject.title, authorName, libraryScan)
} else if (ebookFileScanData?.ebookCoverPath) {
extractedCoverPath = await CoverManager.saveEbookCoverArt(ebookFileScanData, libraryItemObj.id, libraryItemDir)
if (extractedCoverPath) {
libraryScan.addLog(LogLevel.DEBUG, `Extracted embedded cover from ebook file at "${extractedCoverPath}" for book "${bookObject.title}"`)
bookObject.coverPath = extractedCoverPath
}
}
}
// If cover not found then search for cover if enabled in settings
if (!bookObject.coverPath && Database.serverSettings.scannerFindCovers) {
const authorName = bookMetadata.authors.join(', ')
bookObject.coverPath = await this.searchForCover(libraryItemObj.id, libraryItemDir, bookObject.title, authorName, libraryScan)
}
libraryItemObj.book = bookObject
const libraryItem = await Database.libraryItemModel.create(libraryItemObj, {
include: {
@@ -564,13 +593,14 @@ class BookScanner {
/**
*
* @param {import('../models/Book').AudioFileObject[]} audioFiles
* @param {import('../utils/parsers/parseEbookMetadata').EBookFileScanData} ebookFileScanData
* @param {import('./LibraryItemScanData')} libraryItemData
* @param {LibraryScan} libraryScan
* @param {import('../models/Library').LibrarySettingsObject} librarySettings
* @param {string} [existingLibraryItemId]
* @returns {Promise<BookMetadataObject>}
*/
async getBookMetadataFromScanData(audioFiles, libraryItemData, libraryScan, librarySettings, existingLibraryItemId = null) {
async getBookMetadataFromScanData(audioFiles, ebookFileScanData, libraryItemData, libraryScan, librarySettings, existingLibraryItemId = null) {
// First set book metadata from folder/file names
const bookMetadata = {
title: libraryItemData.mediaMetadata.title, // required
@@ -593,7 +623,7 @@ class BookScanner {
coverPath: undefined
}
const bookMetadataSourceHandler = new BookScanner.BookMetadataSourceHandler(bookMetadata, audioFiles, libraryItemData, libraryScan, existingLibraryItemId)
const bookMetadataSourceHandler = new BookScanner.BookMetadataSourceHandler(bookMetadata, audioFiles, ebookFileScanData, libraryItemData, libraryScan, existingLibraryItemId)
const metadataPrecedence = librarySettings.metadataPrecedence || ['folderStructure', 'audioMetatags', 'nfoFile', 'txtFiles', 'opfFile', 'absMetadata']
libraryScan.addLog(LogLevel.DEBUG, `"${bookMetadata.title}" Getting metadata with precedence [${metadataPrecedence.join(', ')}]`)
for (const metadataSource of metadataPrecedence) {
@@ -621,13 +651,15 @@ class BookScanner {
*
* @param {Object} bookMetadata
* @param {import('../models/Book').AudioFileObject[]} audioFiles
* @param {import('../utils/parsers/parseEbookMetadata').EBookFileScanData} ebookFileScanData
* @param {import('./LibraryItemScanData')} libraryItemData
* @param {LibraryScan} libraryScan
* @param {string} existingLibraryItemId
*/
constructor(bookMetadata, audioFiles, libraryItemData, libraryScan, existingLibraryItemId) {
constructor(bookMetadata, audioFiles, ebookFileScanData, libraryItemData, libraryScan, existingLibraryItemId) {
this.bookMetadata = bookMetadata
this.audioFiles = audioFiles
this.ebookFileScanData = ebookFileScanData
this.libraryItemData = libraryItemData
this.libraryScan = libraryScan
this.existingLibraryItemId = existingLibraryItemId
@@ -641,13 +673,42 @@ class BookScanner {
}
/**
* Metadata from audio file meta tags
* Metadata from audio file meta tags OR metadata from ebook file
*/
audioMetatags() {
if (!this.audioFiles.length) return
// Modifies bookMetadata with metadata mapped from audio file meta tags
const bookTitle = this.bookMetadata.title || this.libraryItemData.mediaMetadata.title
AudioFileScanner.setBookMetadataFromAudioMetaTags(bookTitle, this.audioFiles, this.bookMetadata, this.libraryScan)
if (this.audioFiles.length) {
// Modifies bookMetadata with metadata mapped from audio file meta tags
const bookTitle = this.bookMetadata.title || this.libraryItemData.mediaMetadata.title
AudioFileScanner.setBookMetadataFromAudioMetaTags(bookTitle, this.audioFiles, this.bookMetadata, this.libraryScan)
} else if (this.ebookFileScanData) {
const ebookMetdataObject = this.ebookFileScanData.metadata || {}
for (const key in ebookMetdataObject) {
if (key === 'tags') {
if (ebookMetdataObject.tags.length) {
this.bookMetadata.tags = ebookMetdataObject.tags
}
} else if (key === 'genres') {
if (ebookMetdataObject.genres.length) {
this.bookMetadata.genres = ebookMetdataObject.genres
}
} else if (key === 'authors') {
if (ebookMetdataObject.authors?.length) {
this.bookMetadata.authors = ebookMetdataObject.authors
}
} else if (key === 'narrators') {
if (ebookMetdataObject.narrators?.length) {
this.bookMetadata.narrators = ebookMetdataObject.narrators
}
} else if (key === 'series') {
if (ebookMetdataObject.series?.length) {
this.bookMetadata.series = ebookMetdataObject.series
}
} else if (ebookMetdataObject[key] && key !== 'sequence') {
this.bookMetadata[key] = ebookMetdataObject[key]
}
}
}
return null
}
/**
@@ -657,7 +718,7 @@ class BookScanner {
if (!this.libraryItemData.metadataNfoLibraryFile) return
await NfoFileScanner.scanBookNfoFile(this.libraryItemData.metadataNfoLibraryFile, this.bookMetadata)
}
/**
* Description from desc.txt and narrator from reader.txt
*/

View File

@@ -32,11 +32,8 @@ class OpfFileScanner {
bookMetadata.narrators = opfMetadata.narrators
}
} else if (key === 'series') {
if (opfMetadata.series) {
bookMetadata.series = [{
name: opfMetadata.series,
sequence: opfMetadata.sequence || null
}]
if (opfMetadata.series?.length) {
bookMetadata.series = opfMetadata.series
}
} else if (opfMetadata[key] && key !== 'sequence') {
bookMetadata[key] = opfMetadata[key]

View File

@@ -2,7 +2,6 @@ const uuidv4 = require("uuid").v4
const Path = require('path')
const { LogLevel } = require('../utils/constants')
const { getTitleIgnorePrefix } = require('../utils/index')
const abmetadataGenerator = require('../utils/generators/abmetadataGenerator')
const AudioFileScanner = require('./AudioFileScanner')
const Database = require('../Database')
const { filePathToPOSIX, getFileTimestampsWithIno } = require('../utils/fileUtils')

View File

@@ -1,6 +1,7 @@
const axios = require('axios')
const Path = require('path')
const ssrfFilter = require('ssrf-req-filter')
const exec = require('child_process').exec
const fs = require('../libs/fsExtra')
const rra = require('../libs/recursiveReaddirAsync')
const Logger = require('../Logger')
@@ -81,7 +82,12 @@ module.exports.getFileSize = async (path) => {
* @returns {Promise<number>} epoch timestamp
*/
module.exports.getFileMTimeMs = async (path) => {
return (await getFileStat(path))?.mtimeMs || 0
try {
return (await getFileStat(path))?.mtimeMs || 0
} catch (err) {
Logger.error(`[fileUtils] Failed to getFileMtimeMs`, err)
return 0
}
}
/**
@@ -354,3 +360,84 @@ module.exports.encodeUriPath = (path) => {
const uri = new URL(path, "file://")
return uri.pathname
}
/**
* Check if directory is writable.
* This method is necessary because fs.access(directory, fs.constants.W_OK) does not work on Windows
*
* @param {string} directory
* @returns {Promise<boolean>}
*/
module.exports.isWritable = async (directory) => {
try {
const accessTestFile = Path.join(directory, 'accessTest')
await fs.writeFile(accessTestFile, '')
await fs.remove(accessTestFile)
return true
} catch (err) {
return false
}
}
/**
* Get Windows drives as array e.g. ["C:/", "F:/"]
*
* @returns {Promise<string[]>}
*/
module.exports.getWindowsDrives = async () => {
if (!global.isWin) {
return []
}
return new Promise((resolve, reject) => {
exec('wmic logicaldisk get name', async (error, stdout, stderr) => {
if (error) {
reject(error)
return
}
let drives = stdout?.split(/\r?\n/).map(line => line.trim()).filter(line => line).slice(1)
const validDrives = []
for (const drive of drives) {
let drivepath = drive + '/'
if (await fs.pathExists(drivepath)) {
validDrives.push(drivepath)
} else {
Logger.error(`Invalid drive ${drivepath}`)
}
}
resolve(validDrives)
})
})
}
/**
* Get array of directory paths in a directory
*
* @param {string} dirPath
* @param {number} level
* @returns {Promise<{ path:string, dirname:string, level:number }[]>}
*/
module.exports.getDirectoriesInPath = async (dirPath, level) => {
try {
const paths = await fs.readdir(dirPath)
let dirs = await Promise.all(paths.map(async dirname => {
const fullPath = Path.join(dirPath, dirname)
const lstat = await fs.lstat(fullPath).catch((error) => {
Logger.debug(`Failed to lstat "${fullPath}"`, error)
return null
})
if (!lstat?.isDirectory()) return null
return {
path: this.filePathToPOSIX(fullPath),
dirname,
level
}
}))
dirs = dirs.filter(d => d)
return dirs
} catch (error) {
Logger.error('Failed to readdir', dirPath, error)
return []
}
}
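For orientation, a minimal sketch of how these new helpers could be combined by a folder browser; the require path and the logging are illustrative, not taken from the diff:

const fileUtils = require('./server/utils/fileUtils') // illustrative path

async function listBrowseRoots() {
  // On Windows enumerate drive letters (mirrors the server's global.isWin flag), otherwise start at the filesystem root
  const roots = global.isWin ? await fileUtils.getWindowsDrives() : ['/']
  for (const root of roots) {
    const writable = await fileUtils.isWritable(root)
    const dirs = await fileUtils.getDirectoriesInPath(root, 0)
    console.log(root, writable ? '(writable)' : '(not writable)', dirs.map(d => d.dirname))
  }
}

listBrowseRoots()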

View File

@@ -1,4 +1,5 @@
const xml = require('../../libs/xml')
const escapeForXML = require('../../libs/xml/escapeForXML')
/**
* Generate OPML file string for podcasts in a library
@@ -12,18 +13,18 @@ module.exports.generate = (podcasts, indent = true) => {
if (!podcast.feedURL) return
const feedAttributes = {
type: 'rss',
- text: podcast.title,
- title: podcast.title,
- xmlUrl: podcast.feedURL
+ text: escapeForXML(podcast.title),
+ title: escapeForXML(podcast.title),
+ xmlUrl: escapeForXML(podcast.feedURL)
}
if (podcast.description) {
- feedAttributes.description = podcast.description
+ feedAttributes.description = escapeForXML(podcast.description)
}
if (podcast.itunesPageUrl) {
- feedAttributes.htmlUrl = podcast.itunesPageUrl
+ feedAttributes.htmlUrl = escapeForXML(podcast.itunesPageUrl)
}
if (podcast.language) {
- feedAttributes.language = podcast.language
+ feedAttributes.language = escapeForXML(podcast.language)
}
bodyItems.push({
outline: {
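The escaping above is what keeps reserved characters in podcast titles and URLs from producing invalid OPML attributes; a quick sketch, assuming the bundled escapeForXML follows the usual XML entity map (the require path and sample title are illustrative):

const escapeForXML = require('./server/libs/xml/escapeForXML') // illustrative path

// A title like this previously ended up unescaped in the outline attributes
console.log(escapeForXML('Tips & Tricks <Weekly>'))
// Expected output along the lines of: Tips &amp; Tricks &lt;Weekly&gt;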

View File

@@ -0,0 +1,35 @@
/**
* TODO: Add more fields
* @see https://anansi-project.github.io/docs/comicinfo/intro
*
* @param {Object} comicInfoJson
* @returns {import('../../scanner/BookScanner').BookMetadataObject}
*/
module.exports.parse = (comicInfoJson) => {
if (!comicInfoJson?.ComicInfo) return null
const ComicSeries = comicInfoJson.ComicInfo.Series?.[0]?.trim() || null
const ComicNumber = comicInfoJson.ComicInfo.Number?.[0]?.trim() || null
const ComicSummary = comicInfoJson.ComicInfo.Summary?.[0]?.trim() || null
let title = null
const series = []
if (ComicSeries) {
series.push({
name: ComicSeries,
sequence: ComicNumber
})
title = ComicSeries
if (ComicNumber) {
title += ` ${ComicNumber}`
}
}
return {
title,
series,
description: ComicSummary
}
}
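A sketch of the JSON shape this parser expects once ComicInfo.xml has gone through xmlToJSON (element text ends up wrapped in arrays, hence the [0] accesses above); the field values are made up:

const parseComicInfoMetadata = require('./server/utils/parsers/parseComicInfoMetadata')

const comicInfoJson = {
  ComicInfo: {
    Series: ['Example Series'],
    Number: ['12'],
    Summary: ['An example summary.']
  }
}

const metadata = parseComicInfoMetadata.parse(comicInfoJson)
// metadata => {
//   title: 'Example Series 12',
//   series: [{ name: 'Example Series', sequence: '12' }],
//   description: 'An example summary.'
// }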

View File

@@ -0,0 +1,109 @@
const Path = require('path')
const globals = require('../globals')
const fs = require('../../libs/fsExtra')
const Logger = require('../../Logger')
const Archive = require('../../libs/libarchive/archive')
const { xmlToJSON } = require('../index')
const parseComicInfoMetadata = require('./parseComicInfoMetadata')
/**
*
* @param {string} filepath
* @returns {Promise<Buffer>}
*/
async function getComicFileBuffer(filepath) {
if (!await fs.pathExists(filepath)) {
Logger.error(`Comic path does not exist "${filepath}"`)
return null
}
try {
return fs.readFile(filepath)
} catch (error) {
Logger.error(`Failed to read comic at "${filepath}"`, error)
return null
}
}
/**
* Extract cover image from comic. Returns true on success
*
* @param {string} comicPath
* @param {string} comicImageFilepath
* @param {string} outputCoverPath
* @returns {Promise<boolean>}
*/
async function extractCoverImage(comicPath, comicImageFilepath, outputCoverPath) {
const comicFileBuffer = await getComicFileBuffer(comicPath)
if (!comicFileBuffer) return null
const archive = await Archive.open(comicFileBuffer)
const fileEntry = await archive.extractSingleFile(comicImageFilepath)
if (!fileEntry?.fileData) {
Logger.error(`[parseComicMetadata] Invalid file entry data for comicPath "${comicPath}"/${comicImageFilepath}`)
return false
}
try {
await fs.writeFile(outputCoverPath, fileEntry.fileData)
return true
} catch (error) {
Logger.error(`[parseComicMetadata] Failed to extract image from comicPath "${comicPath}"`, error)
return false
}
}
module.exports.extractCoverImage = extractCoverImage
/**
* Parse metadata from comic
*
* @param {import('../../models/Book').EBookFileObject} ebookFile
* @returns {Promise<import('./parseEbookMetadata').EBookFileScanData>}
*/
async function parse(ebookFile) {
const comicPath = ebookFile.metadata.path
Logger.debug(`Parsing metadata from comic at "${comicPath}"`)
const comicFileBuffer = await getComicFileBuffer(comicPath)
if (!comicFileBuffer) return null
const archive = await Archive.open(comicFileBuffer)
const fileObjects = await archive.getFilesArray()
fileObjects.sort((a, b) => {
return a.file.name.localeCompare(b.file.name, undefined, {
numeric: true,
sensitivity: 'base'
})
})
let metadata = null
const comicInfo = fileObjects.find(fo => fo.file.name === 'ComicInfo.xml')
if (comicInfo) {
const comicInfoEntry = await comicInfo.file.extract()
if (comicInfoEntry?.fileData) {
const comicInfoStr = new TextDecoder().decode(comicInfoEntry.fileData)
const comicInfoJson = await xmlToJSON(comicInfoStr)
if (comicInfoJson) {
metadata = parseComicInfoMetadata.parse(comicInfoJson)
}
}
}
const payload = {
path: comicPath,
ebookFormat: ebookFile.ebookFormat,
metadata
}
const firstImage = fileObjects.find(fo => globals.SupportedImageTypes.includes(Path.extname(fo.file.name).toLowerCase().slice(1)))
if (firstImage?.file?._path) {
payload.ebookCoverPath = firstImage.file._path
} else {
Logger.warn(`Cover image not found in comic at "${comicPath}"`)
}
return payload
}
module.exports.parse = parse
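The numeric, case-insensitive sort above is what places the first page ahead of later ones regardless of zero padding, so the cover candidate is the archive's first image; a small illustration with made-up filenames:

const pages = ['page10.jpg', 'page2.jpg', 'ComicInfo.xml', 'page1.jpg']
pages.sort((a, b) => a.localeCompare(b, undefined, { numeric: true, sensitivity: 'base' }))
console.log(pages)
// => [ 'ComicInfo.xml', 'page1.jpg', 'page2.jpg', 'page10.jpg' ]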

View File

@@ -0,0 +1,47 @@
const parseEpubMetadata = require('./parseEpubMetadata')
const parseComicMetadata = require('./parseComicMetadata')
/**
* @typedef EBookFileScanData
* @property {string} path
* @property {string} ebookFormat
* @property {string} ebookCoverPath internal image path
* @property {import('../../scanner/BookScanner').BookMetadataObject} metadata
*/
/**
* Parse metadata from ebook file
*
* @param {import('../../models/Book').EBookFileObject} ebookFile
* @returns {Promise<EBookFileScanData>}
*/
async function parse(ebookFile) {
if (!ebookFile) return null
if (ebookFile.ebookFormat === 'epub') {
return parseEpubMetadata.parse(ebookFile)
} else if (['cbz', 'cbr'].includes(ebookFile.ebookFormat)) {
return parseComicMetadata.parse(ebookFile)
}
return null
}
module.exports.parse = parse
/**
* Extract cover from ebook file
*
* @param {EBookFileScanData} ebookFileScanData
* @param {string} outputCoverPath
* @returns {Promise<boolean>}
*/
async function extractCoverImage(ebookFileScanData, outputCoverPath) {
if (!ebookFileScanData?.ebookCoverPath) return false
if (ebookFileScanData.ebookFormat === 'epub') {
return parseEpubMetadata.extractCoverImage(ebookFileScanData.path, ebookFileScanData.ebookCoverPath, outputCoverPath)
} else if (['cbz', 'cbr'].includes(ebookFileScanData.ebookFormat)) {
return parseComicMetadata.extractCoverImage(ebookFileScanData.path, ebookFileScanData.ebookCoverPath, outputCoverPath)
}
return false
}
module.exports.extractCoverImage = extractCoverImage
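A usage sketch for this dispatcher; the ebook file object is abbreviated and the paths are illustrative:

const parseEbookMetadata = require('./server/utils/parsers/parseEbookMetadata')

async function scanEbookCoverAndMetadata() {
  // Abbreviated ebook file object; the real model carries more fields
  const ebookFile = {
    ebookFormat: 'epub',
    metadata: { path: '/books/Example Book/book.epub' }
  }
  const scanData = await parseEbookMetadata.parse(ebookFile)
  if (scanData?.ebookCoverPath) {
    // Writes the embedded cover image out to the given path (illustrative)
    await parseEbookMetadata.extractCoverImage(scanData, '/metadata/items/example/cover.jpg')
  }
  return scanData?.metadata || null
}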

View File

@@ -0,0 +1,110 @@
const Path = require('path')
const Logger = require('../../Logger')
const StreamZip = require('../../libs/nodeStreamZip')
const parseOpfMetadata = require('./parseOpfMetadata')
const { xmlToJSON } = require('../index')
/**
* Extract file from epub and return string content
*
* @param {string} epubPath
* @param {string} filepath
* @returns {Promise<string>}
*/
async function extractFileFromEpub(epubPath, filepath) {
const zip = new StreamZip.async({ file: epubPath })
const data = await zip.entryData(filepath).catch((error) => {
Logger.error(`[parseEpubMetadata] Failed to extract ${filepath} from epub at "${epubPath}"`, error)
})
const filedata = data?.toString('utf8')
await zip.close()
return filedata
}
/**
* Extract an XML file from epub and return JSON
*
* @param {string} epubPath
* @param {string} xmlFilepath
* @returns {Promise<Object>}
*/
async function extractXmlToJson(epubPath, xmlFilepath) {
const filedata = await extractFileFromEpub(epubPath, xmlFilepath)
if (!filedata) return null
return xmlToJSON(filedata)
}
/**
* Extract cover image from epub. Returns true on success
*
* @param {string} epubPath
* @param {string} epubImageFilepath
* @param {string} outputCoverPath
* @returns {Promise<boolean>}
*/
async function extractCoverImage(epubPath, epubImageFilepath, outputCoverPath) {
const zip = new StreamZip.async({ file: epubPath })
const success = await zip.extract(epubImageFilepath, outputCoverPath).then(() => true).catch((error) => {
Logger.error(`[parseEpubMetadata] Failed to extract image ${epubImageFilepath} from epub at "${epubPath}"`, error)
return false
})
await zip.close()
return success
}
module.exports.extractCoverImage = extractCoverImage
/**
* Parse metadata from epub
*
* @param {import('../../models/Book').EBookFileObject} ebookFile
* @returns {Promise<import('./parseEbookMetadata').EBookFileScanData>}
*/
async function parse(ebookFile) {
const epubPath = ebookFile.metadata.path
Logger.debug(`Parsing metadata from epub at "${epubPath}"`)
// Entrypoint of the epub that contains the filepath to the package document (opf file)
const containerJson = await extractXmlToJson(epubPath, 'META-INF/container.xml')
// Get package document opf filepath from container.xml
const packageDocPath = containerJson.container?.rootfiles?.[0]?.rootfile?.[0]?.$?.['full-path']
if (!packageDocPath) {
Logger.error(`Failed to get package doc path in Container.xml`, JSON.stringify(containerJson, null, 2))
return null
}
// Extract package document to JSON
const packageJson = await extractXmlToJson(epubPath, packageDocPath)
if (!packageJson) {
return null
}
// Parse metadata from package document opf file
const opfMetadata = parseOpfMetadata.parseOpfMetadataJson(packageJson)
if (!opfMetadata) {
Logger.error(`Unable to parse metadata in package doc with json`, JSON.stringify(packageJson, null, 2))
return null
}
const payload = {
path: epubPath,
ebookFormat: 'epub',
metadata: opfMetadata
}
// Attempt to find filepath to cover image
const manifestFirstImage = packageJson.package?.manifest?.[0]?.item?.find(item => item.$?.['media-type']?.startsWith('image/'))
let coverImagePath = manifestFirstImage?.$?.href
if (coverImagePath) {
const packageDirname = Path.dirname(packageDocPath)
payload.ebookCoverPath = Path.posix.join(packageDirname, coverImagePath)
} else {
Logger.warn(`Cover image not found in manifest for epub at "${epubPath}"`)
}
return payload
}
module.exports.parse = parse
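For context, the container.xml lookup above follows the standard EPUB OCF layout; a sketch with an illustrative container document showing the same access path, assuming xmlToJSON produces the xml2js-style output used throughout these parsers (attribute values land under "$"):

const { xmlToJSON } = require('./server/utils/index') // illustrative path

const containerXml = `<?xml version="1.0"?>
<container version="1.0" xmlns="urn:oasis:names:tc:opendocument:xmlns:container">
  <rootfiles>
    <rootfile full-path="OEBPS/content.opf" media-type="application/oebps-package+xml"/>
  </rootfiles>
</container>`

async function findPackageDocPath() {
  const containerJson = await xmlToJSON(containerXml)
  // Same access path used by parse() above
  return containerJson.container?.rootfiles?.[0]?.rootfile?.[0]?.$?.['full-path'] // "OEBPS/content.opf"
}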

View File

@@ -100,13 +100,28 @@ function fetchLanguage(metadata) {
}
function fetchSeries(metadataMeta) {
- if (!metadataMeta) return null
- return fetchTagString(metadataMeta, "calibre:series")
- }
+ if (!metadataMeta) return []
+ const result = []
+ for (let i = 0; i < metadataMeta.length; i++) {
+ if (metadataMeta[i].$?.name === 'calibre:series' && metadataMeta[i].$.content?.trim()) {
+ const name = metadataMeta[i].$.content.trim()
+ let sequence = null
+ if (metadataMeta[i + 1]?.$?.name === 'calibre:series_index' && metadataMeta[i + 1].$?.content?.trim()) {
+ sequence = metadataMeta[i + 1].$.content.trim()
+ }
+ result.push({ name, sequence })
+ }
+ }
- function fetchVolumeNumber(metadataMeta) {
- if (!metadataMeta) return null
- return fetchTagString(metadataMeta, "calibre:series_index")
+ // If one series was found with no series_index then check if any series_index meta can be found
+ // this is to support when calibre:series_index is not directly underneath calibre:series
+ if (result.length === 1 && !result[0].sequence) {
+ const seriesIndexMeta = metadataMeta.find(m => m.$?.name === 'calibre:series_index' && m.$.content?.trim())
+ if (seriesIndexMeta) {
+ result[0].sequence = seriesIndexMeta.$.content.trim()
+ }
+ }
+ return result
}
function fetchNarrators(creators, metadata) {
@@ -130,11 +145,7 @@ function stripPrefix(str) {
return str.split(':').pop()
}
- module.exports.parseOpfMetadataXML = async (xml) => {
- const json = await xmlToJSON(xml)
- if (!json) return null
+ module.exports.parseOpfMetadataJson = (json) => {
// Handle <package ...> or with prefix <ns0:package ...>
const packageKey = Object.keys(json).find(key => stripPrefix(key) === 'package')
if (!packageKey) return null
@@ -161,7 +172,7 @@ module.exports.parseOpfMetadataXML = async (xml) => {
const creators = parseCreators(metadata)
const authors = (fetchCreators(creators, 'aut') || []).map(au => au?.trim()).filter(au => au)
const narrators = (fetchNarrators(creators, metadata) || []).map(nrt => nrt?.trim()).filter(nrt => nrt)
- const data = {
+ return {
title: fetchTitle(metadata),
subtitle: fetchSubtitle(metadata),
authors,
@@ -173,9 +184,13 @@ module.exports.parseOpfMetadataXML = async (xml) => {
description: fetchDescription(metadata),
genres: fetchGenres(metadata),
language: fetchLanguage(metadata),
- series: fetchSeries(metadata.meta),
- sequence: fetchVolumeNumber(metadata.meta),
+ series: fetchSeries(metadataMeta),
tags: fetchTags(metadata)
}
- return data
}
+ module.exports.parseOpfMetadataXML = async (xml) => {
+ const json = await xmlToJSON(xml)
+ if (!json) return null
+ return this.parseOpfMetadataJson(json)
+ }

View File

@@ -233,7 +233,7 @@ module.exports.getPodcastFeed = (feedUrl, excludeEpisodeMetadata = false) => {
method: 'GET',
timeout: 12000,
responseType: 'arraybuffer',
- headers: { Accept: 'application/rss+xml' },
+ headers: { Accept: 'application/rss+xml, application/xhtml+xml, application/xml, */*;q=0.8' },
httpAgent: ssrfFilter(feedUrl),
httpsAgent: ssrfFilter(feedUrl)
}).then(async (data) => {

View File

@@ -0,0 +1,264 @@
const chai = require('chai')
const sinon = require('sinon')
const fs = require('../../../server/libs/fsExtra')
const fileUtils = require('../../../server/utils/fileUtils')
const which = require('../../../server/libs/which')
const ffbinaries = require('../../../server/libs/ffbinaries')
const path = require('path')
const BinaryManager = require('../../../server/managers/BinaryManager')
const expect = chai.expect
describe('BinaryManager', () => {
let binaryManager
describe('init', () => {
let findStub
let installStub
let errorStub
let exitStub
beforeEach(() => {
binaryManager = new BinaryManager()
findStub = sinon.stub(binaryManager, 'findRequiredBinaries')
installStub = sinon.stub(binaryManager, 'install')
errorStub = sinon.stub(console, 'error')
exitStub = sinon.stub(process, 'exit')
})
afterEach(() => {
findStub.restore()
installStub.restore()
errorStub.restore()
exitStub.restore()
})
it('should not install binaries if they are already found', async () => {
findStub.resolves([])
await binaryManager.init()
expect(installStub.called).to.be.false
expect(findStub.calledOnce).to.be.true
expect(errorStub.called).to.be.false
expect(exitStub.called).to.be.false
})
it('should install missing binaries', async () => {
const missingBinaries = ['ffmpeg', 'ffprobe']
const missingBinariesAfterInstall = []
findStub.onFirstCall().resolves(missingBinaries)
findStub.onSecondCall().resolves(missingBinariesAfterInstall)
await binaryManager.init()
expect(findStub.calledTwice).to.be.true
expect(installStub.calledOnce).to.be.true
expect(errorStub.called).to.be.false
expect(exitStub.called).to.be.false
})
it('exit if binaries are not found after installation', async () => {
const missingBinaries = ['ffmpeg', 'ffprobe']
const missingBinariesAfterInstall = ['ffmpeg', 'ffprobe']
findStub.onFirstCall().resolves(missingBinaries)
findStub.onSecondCall().resolves(missingBinariesAfterInstall)
await binaryManager.init()
expect(findStub.calledTwice).to.be.true
expect(installStub.calledOnce).to.be.true
expect(errorStub.calledOnce).to.be.true
expect(exitStub.calledOnce).to.be.true
expect(exitStub.calledWith(1)).to.be.true
})
})
describe('findRequiredBinaries', () => {
let findBinaryStub
beforeEach(() => {
const requiredBinaries = [{ name: 'ffmpeg', envVariable: 'FFMPEG_PATH' }]
binaryManager = new BinaryManager(requiredBinaries)
findBinaryStub = sinon.stub(binaryManager, 'findBinary')
})
afterEach(() => {
findBinaryStub.restore()
})
it('should put found paths in the correct environment variables', async () => {
const pathToFFmpeg = '/path/to/ffmpeg'
const missingBinaries = []
delete process.env.FFMPEG_PATH
findBinaryStub.resolves(pathToFFmpeg)
const result = await binaryManager.findRequiredBinaries()
expect(result).to.deep.equal(missingBinaries)
expect(findBinaryStub.calledOnce).to.be.true
expect(process.env.FFMPEG_PATH).to.equal(pathToFFmpeg)
})
it('should add missing binaries to result', async () => {
const missingBinaries = ['ffmpeg']
delete process.env.FFMPEG_PATH
findBinaryStub.resolves(null)
const result = await binaryManager.findRequiredBinaries()
expect(result).to.deep.equal(missingBinaries)
expect(findBinaryStub.calledOnce).to.be.true
expect(process.env.FFMPEG_PATH).to.be.undefined
})
})
describe('install', () => {
let isWritableStub
let downloadBinariesStub
beforeEach(() => {
binaryManager = new BinaryManager()
isWritableStub = sinon.stub(fileUtils, 'isWritable')
downloadBinariesStub = sinon.stub(ffbinaries, 'downloadBinaries')
binaryManager.mainInstallPath = '/path/to/main/install'
binaryManager.altInstallPath = '/path/to/alt/install'
})
afterEach(() => {
isWritableStub.restore()
downloadBinariesStub.restore()
})
it('should not install binaries if no binaries are passed', async () => {
const binaries = []
await binaryManager.install(binaries)
expect(isWritableStub.called).to.be.false
expect(downloadBinariesStub.called).to.be.false
})
it('should install binaries in main install path if has access', async () => {
const binaries = ['ffmpeg']
const destination = binaryManager.mainInstallPath
isWritableStub.withArgs(destination).resolves(true)
downloadBinariesStub.resolves()
await binaryManager.install(binaries)
expect(isWritableStub.calledOnce).to.be.true
expect(downloadBinariesStub.calledOnce).to.be.true
expect(downloadBinariesStub.calledWith(binaries, sinon.match({ destination: destination }))).to.be.true
})
it('should install binaries in alt install path if has no access to main', async () => {
const binaries = ['ffmpeg']
const mainDestination = binaryManager.mainInstallPath
const destination = binaryManager.altInstallPath
isWritableStub.withArgs(mainDestination).resolves(false)
downloadBinariesStub.resolves()
await binaryManager.install(binaries)
expect(isWritableStub.calledOnce).to.be.true
expect(downloadBinariesStub.calledOnce).to.be.true
expect(downloadBinariesStub.calledWith(binaries, sinon.match({ destination: destination }))).to.be.true
})
})
})
describe('findBinary', () => {
let binaryManager
let fsPathExistsStub
let whichSyncStub
let mainInstallPath
let altInstallPath
const name = 'ffmpeg'
const envVariable = 'FFMPEG_PATH'
const defaultPath = '/path/to/ffmpeg'
const executable = name + (process.platform == 'win32' ? '.exe' : '')
const whichPath = '/usr/bin/ffmpeg'
beforeEach(() => {
binaryManager = new BinaryManager()
fsPathExistsStub = sinon.stub(fs, 'pathExists')
whichSyncStub = sinon.stub(which, 'sync')
binaryManager.mainInstallPath = '/path/to/main/install'
mainInstallPath = path.join(binaryManager.mainInstallPath, executable)
binaryManager.altInstallPath = '/path/to/alt/install'
altInstallPath = path.join(binaryManager.altInstallPath, executable)
})
afterEach(() => {
fsPathExistsStub.restore()
whichSyncStub.restore()
})
it('should return defaultPath if it exists', async () => {
process.env[envVariable] = defaultPath
fsPathExistsStub.withArgs(defaultPath).resolves(true)
const result = await binaryManager.findBinary(name, envVariable)
expect(result).to.equal(defaultPath)
expect(fsPathExistsStub.calledOnceWith(defaultPath)).to.be.true
expect(whichSyncStub.notCalled).to.be.true
})
it('should return whichPath if it exists', async () => {
delete process.env[envVariable]
whichSyncStub.returns(whichPath)
const result = await binaryManager.findBinary(name, envVariable)
expect(result).to.equal(whichPath)
expect(fsPathExistsStub.notCalled).to.be.true
expect(whichSyncStub.calledOnce).to.be.true
})
it('should return mainInstallPath if it exists', async () => {
delete process.env[envVariable]
whichSyncStub.returns(null)
fsPathExistsStub.withArgs(mainInstallPath).resolves(true)
const result = await binaryManager.findBinary(name, envVariable)
expect(result).to.equal(mainInstallPath)
expect(whichSyncStub.calledOnce).to.be.true
expect(fsPathExistsStub.calledOnceWith(mainInstallPath)).to.be.true
})
it('should return altInstallPath if it exists', async () => {
delete process.env[envVariable]
whichSyncStub.returns(null)
fsPathExistsStub.withArgs(mainInstallPath).resolves(false)
fsPathExistsStub.withArgs(altInstallPath).resolves(true)
const result = await binaryManager.findBinary(name, envVariable)
expect(result).to.equal(altInstallPath)
expect(whichSyncStub.calledOnce).to.be.true
expect(fsPathExistsStub.calledTwice).to.be.true
expect(fsPathExistsStub.calledWith(mainInstallPath)).to.be.true
expect(fsPathExistsStub.calledWith(altInstallPath)).to.be.true
})
it('should return null if binary is not found', async () => {
delete process.env[envVariable]
whichSyncStub.returns(null)
fsPathExistsStub.withArgs(mainInstallPath).resolves(false)
fsPathExistsStub.withArgs(altInstallPath).resolves(false)
const result = await binaryManager.findBinary(name, envVariable)
expect(result).to.be.null
expect(whichSyncStub.calledOnce).to.be.true
expect(fsPathExistsStub.calledTwice).to.be.true
expect(fsPathExistsStub.calledWith(mainInstallPath)).to.be.true
expect(fsPathExistsStub.calledWith(altInstallPath)).to.be.true
})
})

View File

@@ -0,0 +1,130 @@
const chai = require('chai')
const expect = chai.expect
const { parseOpfMetadataXML } = require('../../../../server/utils/parsers/parseOpfMetadata')
describe('parseOpfMetadata - test series', async () => {
it('test one series', async () => {
const opf = `
<?xml version='1.0' encoding='UTF-8'?>
<package xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:opf="http://www.idpf.org/2007/opf" xml:lang="en" version="3.0" unique-identifier="bookid">
<metadata>
<meta name="calibre:series" content="Serie"/>
<meta name="calibre:series_index" content="1"/>
</metadata>
</package>
`
const parsedOpf = await parseOpfMetadataXML(opf)
expect(parsedOpf.series).to.deep.equal([{ "name": "Serie", "sequence": "1" }])
})
it('test more than 1 series - in correct order', async () => {
const opf = `
<?xml version='1.0' encoding='UTF-8'?>
<package xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:opf="http://www.idpf.org/2007/opf" xml:lang="en" version="3.0" unique-identifier="bookid">
<metadata>
<meta name="calibre:series" content="Serie 1"/>
<meta name="calibre:series_index" content="1"/>
<meta name="calibre:series" content="Serie 2"/>
<meta name="calibre:series_index" content="2"/>
<meta name="calibre:series" content="Serie 3"/>
<meta name="calibre:series_index" content="3"/>
</metadata>
</package>
`
const parsedOpf = await parseOpfMetadataXML(opf)
expect(parsedOpf.series).to.deep.equal([
{ "name": "Serie 1", "sequence": "1" },
{ "name": "Serie 2", "sequence": "2" },
{ "name": "Serie 3", "sequence": "3" },
])
})
it('test messed order of series content and index', async () => {
const opf = `
<?xml version='1.0' encoding='UTF-8'?>
<package xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:opf="http://www.idpf.org/2007/opf" xml:lang="en" version="3.0" unique-identifier="bookid">
<metadata>
<meta name="calibre:series" content="Serie 1"/>
<meta name="calibre:series_index" content="1"/>
<meta name="calibre:series_index" content="2"/>
<meta name="calibre:series_index" content="3"/>
<meta name="calibre:series" content="Serie 3"/>
</metadata>
</package>
`
const parsedOpf = await parseOpfMetadataXML(opf)
expect(parsedOpf.series).to.deep.equal([
{ "name": "Serie 1", "sequence": "1" },
{ "name": "Serie 3", "sequence": null },
])
})
it('test different values of series content and index', async () => {
const opf = `
<?xml version='1.0' encoding='UTF-8'?>
<package xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:opf="http://www.idpf.org/2007/opf" xml:lang="en" version="3.0" unique-identifier="bookid">
<metadata>
<meta name="calibre:series" content="Serie 1"/>
<meta name="calibre:series_index"/>
<meta name="calibre:series" content="Serie 2"/>
<meta name="calibre:series_index" content="abc"/>
<meta name="calibre:series" content="Serie 3"/>
<meta name="calibre:series_index" content=""/>
</metadata>
</package>
`
const parsedOpf = await parseOpfMetadataXML(opf)
expect(parsedOpf.series).to.deep.equal([
{ "name": "Serie 1", "sequence": null },
{ "name": "Serie 2", "sequence": "abc" },
{ "name": "Serie 3", "sequence": null },
])
})
it('test empty series content', async () => {
const opf = `
<?xml version='1.0' encoding='UTF-8'?>
<package xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:opf="http://www.idpf.org/2007/opf" xml:lang="en" version="3.0" unique-identifier="bookid">
<metadata>
<meta name="calibre:series" content=""/>
<meta name="calibre:series_index" content=""/>
</metadata>
</package>
`
const parsedOpf = await parseOpfMetadataXML(opf)
expect(parsedOpf.series).to.deep.equal([])
})
it('test series and index using an xml namespace', async () => {
const opf = `
<?xml version='1.0' encoding='UTF-8'?>
<ns0:package xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:opf="http://www.idpf.org/2007/opf" xml:lang="en" version="3.0" unique-identifier="bookid">
<ns0:metadata>
<ns0:meta name="calibre:series" content="Serie 1"/>
<ns0:meta name="calibre:series_index" content=""/>
</ns0:metadata>
</ns0:package>
`
const parsedOpf = await parseOpfMetadataXML(opf)
expect(parsedOpf.series).to.deep.equal([
{ "name": "Serie 1", "sequence": null }
])
})
it('test series and series index not directly underneath', async () => {
const opf = `
<?xml version='1.0' encoding='UTF-8'?>
<package xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:opf="http://www.idpf.org/2007/opf" xml:lang="en" version="3.0" unique-identifier="bookid">
<metadata>
<meta name="calibre:series" content="Serie 1"/>
<meta name="calibre:title_sort" content="Test Title"/>
<meta name="calibre:series_index" content="1"/>
</metadata>
</package>
`
const parsedOpf = await parseOpfMetadataXML(opf)
expect(parsedOpf.series).to.deep.equal([
{ "name": "Serie 1", "sequence": "1" }
])
})
})