feat: use SQLite for storing package index in the content-addressable store (#10827)

## Summary

Replace individual `.mpk` (MessagePack) files under `$STORE/index/` with a single SQLite database at `$STORE/index.db` using Node.js 22's built-in `node:sqlite` module. This reduces filesystem syscall overhead and improves space efficiency for small metadata entries.

Closes #10826

## Design

### New package: `@pnpm/store.index`

A new `StoreIndex` class wraps a SQLite database with a simple key-value API (`get`, `set`, `delete`, `has`, `entries`). Data is serialized with msgpackr and stored as BLOBs. The table uses `WITHOUT ROWID` for compact storage.
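A minimal sketch of that key-value surface (not the actual `@pnpm/store.index` code): the `packages` table name is hypothetical, statements are prepared per call, and none of the concurrency or batching concerns covered below are handled here.

```ts
import path from 'path'
import { DatabaseSync } from 'node:sqlite'
import { pack, unpack } from 'msgpackr'

class StoreIndexSketch {
  private readonly db: DatabaseSync

  constructor (storeDir: string) {
    this.db = new DatabaseSync(path.join(storeDir, 'index.db'))
    // WITHOUT ROWID stores rows in the primary-key B-tree only,
    // which keeps small key/value entries compact.
    this.db.exec(`CREATE TABLE IF NOT EXISTS packages (
      key   TEXT PRIMARY KEY,
      value BLOB NOT NULL
    ) WITHOUT ROWID`)
  }

  get (key: string): unknown {
    const row = this.db.prepare('SELECT value FROM packages WHERE key = ?').get(key) as { value: Uint8Array } | undefined
    return row ? unpack(Buffer.from(row.value)) : undefined
  }

  set (key: string, value: unknown): void {
    this.db.prepare('INSERT OR REPLACE INTO packages (key, value) VALUES (?, ?)').run(key, pack(value))
  }

  has (key: string): boolean {
    return this.db.prepare('SELECT 1 FROM packages WHERE key = ?').get(key) != null
  }

  delete (key: string): void {
    this.db.prepare('DELETE FROM packages WHERE key = ?').run(key)
  }

  close (): void {
    this.db.close()
  }
}
```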

Key design decisions:

- **WAL mode** enables concurrent reads from workers while the main process writes.
- **`busy_timeout=5000`** plus a retry loop with `Atomics.wait`-based `sleepSync` handles `SQLITE_BUSY` errors from concurrent access.
- **Performance PRAGMAs**: `synchronous=NORMAL`, `mmap_size=512MB`, `cache_size=32MB`, `temp_store=MEMORY`, `wal_autocheckpoint=10000` (the PRAGMA setup and busy-retry loop are sketched after this list).
- **Write batching**: `queueWrites()` batches pre-packed entries from tarball extraction and flushes them in a single transaction on `process.nextTick`. `setRawMany()` writes a batch immediately (used e.g. by `addFilesFromDir`).
- **Lifecycle**: `close()` auto-flushes pending writes, runs `PRAGMA optimize`, and closes the DB. A `process.on('exit')` handler ensures cleanup even on unexpected exits.
- **`VACUUM` after `deleteMany`** (used by `pnpm store prune`) to reclaim disk space.
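Roughly what the PRAGMA setup and `SQLITE_BUSY` retry from the bullets above might look like. The helper names (`applyPragmas`, `sleepSync`, `withBusyRetry`) are illustrative, and the exact error shape `node:sqlite` throws for a locked database is an assumption.

```ts
import { DatabaseSync } from 'node:sqlite'

function applyPragmas (db: DatabaseSync): void {
  db.exec(`
    PRAGMA journal_mode = WAL;
    PRAGMA synchronous = NORMAL;
    PRAGMA busy_timeout = 5000;
    PRAGMA mmap_size = ${512 * 1024 * 1024};
    PRAGMA cache_size = -${32 * 1024};      -- negative value means KiB
    PRAGMA temp_store = MEMORY;
    PRAGMA wal_autocheckpoint = 10000;
  `)
}

// Synchronous sleep without busy-spinning: wait on a shared int that is never notified.
function sleepSync (ms: number): void {
  Atomics.wait(new Int32Array(new SharedArrayBuffer(4)), 0, 0, ms)
}

function withBusyRetry<T> (fn: () => T, attempts = 5): T {
  for (let attempt = 1; ; attempt++) {
    try {
      return fn()
    } catch (err) {
      // Assumed error shape: node:sqlite surfaces SQLITE_BUSY as an Error
      // whose message mentions the database being busy/locked.
      const busy = err instanceof Error && /SQLITE_BUSY|database is locked/i.test(err.message)
      if (!busy || attempt >= attempts) throw err
      sleepSync(50 * attempt) // simple linear backoff on top of busy_timeout
    }
  }
}
```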

### Key format

Keys are `integrity\tpkgId` (tab-separated). Git-hosted packages use `pkgId\tbuilt` or `pkgId\tnot-built`.
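For illustration, the key helper seen throughout the diffs below (`storeIndexKey`) presumably boils down to the tab join; the git-hosted spellings are taken from the description above.

```ts
// Presumably equivalent to the storeIndexKey() helper used in the diffs below.
function storeIndexKeySketch (integrity: string, pkgId: string): string {
  return `${integrity}\t${pkgId}`
}

storeIndexKeySketch('sha512-AbC…', 'is-positive@1.0.0')
// -> 'sha512-AbC…\tis-positive@1.0.0'

// Git-hosted packages: `${pkgId}\tbuilt` or `${pkgId}\tnot-built`
```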

### Shared StoreIndex instance

A single `StoreIndex` instance is threaded through the entire install lifecycle — from `createNewStoreController` through the fetcher chain, package requester, license scanner, SBOM collector, and dependencies hierarchy. This replaces the previous pattern, in which each component computed index file paths and read or wrote the `.mpk` files on its own.

### Worker architecture

Index writes are performed in the main process, not in worker threads. Workers send pre-packed `{ key, buffer }` pairs back to the main process via `postMessage`, where they are batched and flushed to SQLite. This avoids SQLite write contention between threads.
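A hedged sketch of that handoff plus the `process.nextTick` flush from the batching bullet above. The message shape, the `packages` table, and the internals of `queueWrites` are assumptions based on this description, not the verified `@pnpm/worker` / `@pnpm/store.index` APIs.

```ts
import { parentPort } from 'node:worker_threads'
import { DatabaseSync } from 'node:sqlite'
import { pack } from 'msgpackr'

interface PackedEntry { key: string, buffer: Uint8Array }

// Worker side: serialize the files index once and ship only bytes to the main process.
function sendIndexEntry (key: string, filesIndex: unknown): void {
  const entry: PackedEntry = { key, buffer: pack(filesIndex) }
  parentPort?.postMessage({ type: 'index-write', entry })
}

// Main-process side: batch pre-packed entries and flush them in one transaction.
class WriteQueueSketch {
  private pending: PackedEntry[] = []
  private flushScheduled = false

  constructor (private readonly db: DatabaseSync) {}

  queueWrites (entries: PackedEntry[]): void {
    this.pending.push(...entries)
    if (!this.flushScheduled) {
      this.flushScheduled = true
      process.nextTick(() => { this.flush() })
    }
  }

  flush (): void {
    this.flushScheduled = false
    if (this.pending.length === 0) return
    const insert = this.db.prepare('INSERT OR REPLACE INTO packages (key, value) VALUES (?, ?)')
    this.db.exec('BEGIN')
    try {
      for (const { key, buffer } of this.pending) insert.run(key, buffer)
      this.db.exec('COMMIT')
    } catch (err) {
      this.db.exec('ROLLBACK')
      throw err
    }
    this.pending = []
  }
}
```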

### SQLite ExperimentalWarning suppression

`node:sqlite` emits an `ExperimentalWarning` on first load. This is suppressed via a `process.emitWarning` override injected through esbuild's `banner` option, which runs on line 1 of both `dist/pnpm.mjs` and `dist/worker.js` — before any module that loads `node:sqlite`.
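Roughly like the following, with the entry point, warning text, and banner wording as assumptions — only the `banner` mechanism itself is the point.

```ts
import { build } from 'esbuild'

// Prepended verbatim to line 1 of the bundle, so it runs before any module
// that imports node:sqlite.
const suppressSqliteExperimentalWarning = `
const originalEmitWarning = process.emitWarning;
process.emitWarning = (warning, ...args) => {
  const type = typeof args[0] === 'string' ? args[0] : args[0] && args[0].type;
  if (type === 'ExperimentalWarning' && String(warning).includes('SQLite')) return;
  return originalEmitWarning.call(process, warning, ...args);
};
`.trim()

await build({
  entryPoints: ['src/pnpm.ts'], // hypothetical entry point
  outfile: 'dist/pnpm.mjs',
  bundle: true,
  platform: 'node',
  format: 'esm',
  banner: { js: suppressSqliteExperimentalWarning },
})
```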

### No migration from `.mpk` files

Old `.mpk` index files are not migrated. Packages missing from the new SQLite index are re-fetched on demand (the same behavior as a fresh store).

## Changed packages

121 files changed across these areas:

- **`store/index/`** — New `@pnpm/store.index` package
- **`worker/`** — Write batching moved from worker module into `StoreIndex` class; workers send pre-packed buffers to main process
- **`store/package-store/`** — StoreIndex creation and lifecycle management
- **`store/cafs/`** — Removed `getFilePathInCafs` index-file utilities (no longer needed)
- **`store/pkg-finder/`** — Reads from StoreIndex instead of `.mpk` files
- **`store/plugin-commands-store/`** — `store status` uses StoreIndex
- **`store/plugin-commands-store-inspecting/`** — `cat-index` and `find-hash` use StoreIndex
- **`fetching/tarball-fetcher/`** — Threads StoreIndex through fetchers; git-hosted fetcher flushes before reading
- **`fetching/git-fetcher/`, `binary-fetcher/`, `pick-fetcher/`** — Accept StoreIndex parameter
- **`pkg-manager/`** — `client`, `core`, `headless`, `package-requester` thread StoreIndex
- **`reviewing/`** — `license-scanner`, `sbom`, `dependencies-hierarchy` accept StoreIndex
- **`cache/api/`** — Cache view uses StoreIndex
- **`pnpm/bundle.ts`** — esbuild banner for ExperimentalWarning suppression

## Test plan

- [x] `pnpm --filter @pnpm/store.index test` — Unit tests for StoreIndex CRUD and batching
- [x] `pnpm --filter @pnpm/package-store test` — Store controller lifecycle
- [x] `pnpm --filter @pnpm/package-requester test` — Package requester reads from SQLite index
- [x] `pnpm --filter @pnpm/tarball-fetcher test` — Tarball and git-hosted fetcher writes
- [x] `pnpm --filter @pnpm/headless test` — Headless install
- [x] `pnpm --filter @pnpm/core test` — Core install, side effects, patching
- [x] `pnpm --filter @pnpm/plugin-commands-rebuild test` — Rebuild reads from index
- [x] `pnpm --filter @pnpm/license-scanner test` — License scanning
- [x] e2e tests pass

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Zoltan Kochan
2026-03-06 12:59:04 +01:00
committed by GitHub
parent 0bf7051fb3
commit b7f0f21582
121 changed files with 1642 additions and 622 deletions

View File

@@ -0,0 +1,14 @@
---
"@pnpm/store.index": minor
"@pnpm/store.cafs": minor
"@pnpm/worker": minor
"@pnpm/plugin-commands-store-inspecting": minor
"@pnpm/plugin-commands-store": minor
"@pnpm/package-store": minor
"@pnpm/store.pkg-finder": minor
"@pnpm/reviewing.dependencies-hierarchy": minor
"@pnpm/plugin-commands-rebuild": minor
"pnpm": minor
---
Use SQLite for storing package index in the content-addressable store. Instead of individual `.mpk` files under `$STORE/index/`, package metadata is now stored in a single SQLite database at `$STORE/index.db`. This reduces filesystem syscall overhead, improves space efficiency for small metadata entries, and enables concurrent access via SQLite's WAL mode. Packages missing from the new index are re-fetched on demand [#10826](https://github.com/pnpm/pnpm/issues/10826).

View File

@@ -33,7 +33,8 @@
},
"dependencies": {
"@pnpm/registry-mock": "catalog:",
"@pnpm/store.cafs": "workspace:*"
"@pnpm/store.cafs": "workspace:*",
"@pnpm/store.index": "workspace:*"
},
"devDependencies": {
"@pnpm/assert-store": "workspace:*",

View File

@@ -1,6 +1,6 @@
import fs from 'fs'
import path from 'path'
import { getIndexFilePathInCafs } from '@pnpm/store.cafs'
import { StoreIndex, storeIndexKey } from '@pnpm/store.index'
import { getIntegrity, REGISTRY_MOCK_PORT } from '@pnpm/registry-mock'
export interface StoreAssertions {
@@ -24,15 +24,25 @@ export function assertStore (
const store = {
getPkgIndexFilePath (pkgName: string, version: string): string {
const integrity = getIntegrity(pkgName, version)
return getIndexFilePathInCafs(storePath, integrity, `${pkgName}@${version}`)
return storeIndexKey(integrity, `${pkgName}@${version}`)
},
cafsHas (pkgName: string, version: string): void {
const pathToCheck = store.getPkgIndexFilePath(pkgName, version)
ok(fs.existsSync(pathToCheck))
const storeIndex = new StoreIndex(storePath)
try {
ok(storeIndex.get(pathToCheck) != null)
} finally {
storeIndex.close()
}
},
cafsHasNot (pkgName: string, version: string): void {
const pathToCheck = store.getPkgIndexFilePath(pkgName, version)
notOk(fs.existsSync(pathToCheck))
const storeIndex = new StoreIndex(storePath)
try {
notOk(storeIndex.get(pathToCheck) != null)
} finally {
storeIndex.close()
}
},
storeHas (pkgName: string, version?: string): void {
const pathToCheck = store.resolve(pkgName, version)

View File

@@ -14,6 +14,9 @@
},
{
"path": "../../store/cafs"
},
{
"path": "../../store/index"
}
]
}

View File

@@ -7,6 +7,10 @@ export default {
// Unfortunately, this means that if two such tests will run at the same time,
// they may break each other.
maxWorkers: 1,
// Force Jest to exit after globalTeardown completes. The Verdaccio server
// and lifecycle child-processes spawned during tests may leave ref'd handles
// that prevent the process from exiting on its own.
forceExit: true,
globalSetup: path.join(import.meta.dirname, 'globalSetup.js'),
globalTeardown: path.join(import.meta.dirname, 'globalTeardown.js'),
}

View File

@@ -36,6 +36,7 @@
"@pnpm/fs.msgpack-file": "workspace:*",
"@pnpm/npm-resolver": "workspace:*",
"@pnpm/store.cafs": "workspace:*",
"@pnpm/store.index": "workspace:*",
"encode-registry": "catalog:",
"tinyglobby": "catalog:"
},

View File

@@ -1,8 +1,7 @@
import fs from 'fs'
import path from 'path'
import { glob } from 'tinyglobby'
import { readMsgpackFileSync } from '@pnpm/fs.msgpack-file'
import { getIndexFilePathInCafs } from '@pnpm/store.cafs'
import { StoreIndex, storeIndexKey } from '@pnpm/store.index'
import { type PackageMeta } from '@pnpm/npm-resolver'
import getRegistryName from 'encode-registry'
@@ -20,35 +19,40 @@ export async function cacheView (opts: { cacheDir: string, storeDir: string, reg
expandDirectories: false,
})).sort()
const metaFilesByPath: Record<string, CachedVersions> = {}
for (const filePath of metaFilePaths) {
let metaObject: PackageMeta | null
try {
metaObject = readMsgpackFileSync<PackageMeta>(path.join(opts.cacheDir, filePath))
} catch {
continue
}
if (!metaObject) continue
const cachedVersions: string[] = []
const nonCachedVersions: string[] = []
for (const [version, manifest] of Object.entries(metaObject.versions)) {
if (!manifest.dist.integrity) continue
const indexFilePath = getIndexFilePathInCafs(opts.storeDir, manifest.dist.integrity, `${manifest.name}@${manifest.version}`)
if (fs.existsSync(indexFilePath)) {
cachedVersions.push(version)
} else {
nonCachedVersions.push(version)
const storeIndex = new StoreIndex(opts.storeDir)
try {
for (const filePath of metaFilePaths) {
let metaObject: PackageMeta | null
try {
metaObject = readMsgpackFileSync<PackageMeta>(path.join(opts.cacheDir, filePath))
} catch {
continue
}
if (!metaObject) continue
const cachedVersions: string[] = []
const nonCachedVersions: string[] = []
for (const [version, manifest] of Object.entries(metaObject.versions)) {
if (!manifest.dist.integrity) continue
const key = storeIndexKey(manifest.dist.integrity, `${manifest.name}@${manifest.version}`)
if (storeIndex.has(key)) {
cachedVersions.push(version)
} else {
nonCachedVersions.push(version)
}
}
let registryName = filePath
while (path.dirname(registryName) !== '.') {
registryName = path.dirname(registryName)
}
metaFilesByPath[registryName.replaceAll('+', ':')] = {
cachedVersions,
nonCachedVersions,
cachedAt: metaObject.cachedAt ? new Date(metaObject.cachedAt).toString() : undefined,
distTags: metaObject['dist-tags'],
}
}
let registryName = filePath
while (path.dirname(registryName) !== '.') {
registryName = path.dirname(registryName)
}
metaFilesByPath[registryName.replaceAll('+', ':')] = {
cachedVersions,
nonCachedVersions,
cachedAt: metaObject.cachedAt ? new Date(metaObject.cachedAt).toString() : undefined,
distTags: metaObject['dist-tags'],
}
} finally {
storeIndex.close()
}
return JSON.stringify(metaFilesByPath, null, 2)
}

View File

@@ -26,6 +26,9 @@
},
{
"path": "../../store/cafs"
},
{
"path": "../../store/index"
}
]
}

View File

@@ -9,6 +9,7 @@
"archy",
"argumentless",
"armv",
"autocheckpoint",
"autocompleting",
"autofix",
"autofixed",
@@ -76,6 +77,7 @@
"enten",
"eperm",
"epipe",
"errcode",
"etamponi",
"exdev",
"execa",
@@ -155,6 +157,7 @@
"metafile",
"millis",
"mintimeout",
"mmap",
"monorepolint",
"moonrepo",
"mountpoint",
@@ -297,6 +300,8 @@
"supercede",
"syml",
"syncer",
"syscall",
"syscalls",
"szia",
"tabtab",
"taffydb",

View File

@@ -38,6 +38,7 @@
"@pnpm/fetching-types": "workspace:*",
"@pnpm/fetching.binary-fetcher": "workspace:*",
"@pnpm/node.resolver": "workspace:*",
"@pnpm/store.index": "workspace:*",
"@pnpm/tarball-fetcher": "workspace:*",
"detect-libc": "catalog:"
},

View File

@@ -6,6 +6,7 @@ import {
} from '@pnpm/fetching-types'
import { createCafsStore } from '@pnpm/create-cafs-store'
import { type Cafs } from '@pnpm/cafs-types'
import { type StoreIndex } from '@pnpm/store.index'
import { createTarballFetcher } from '@pnpm/tarball-fetcher'
import {
getNodeArtifactAddress,
@@ -17,6 +18,7 @@ import { isNonGlibcLinux } from 'detect-libc'
export interface FetchNodeOptionsToDir {
storeDir: string
storeIndex: StoreIndex
fetchTimeout?: number
nodeMirrorBaseUrl?: string
retry?: RetryTimeoutOptions
@@ -167,6 +169,7 @@ async function downloadAndUnpackTarballToDir (
const fetchers = createTarballFetcher(fetch, getAuthHeader, {
retry: opts.retry,
timeout: opts.fetchTimeout,
storeIndex: opts.storeIndex,
// These are not needed for fetching Node.js
rawConfig: {},
unsafePerm: false,

View File

@@ -3,6 +3,7 @@ import { Response } from 'node-fetch'
import path from 'path'
import { Readable } from 'stream'
import type { FetchNodeOptionsToDir as FetchNodeOptions } from '@pnpm/node.fetcher'
import { StoreIndex } from '@pnpm/store.index'
import { tempDir } from '@pnpm/prepare'
import { jest } from '@jest/globals'
@@ -38,6 +39,11 @@ const fetchMock = jest.fn(async (url: string) => {
return new Response(Readable.from(Buffer.alloc(0)))
})
const storeIndexes: StoreIndex[] = []
afterAll(() => {
for (const si of storeIndexes) si.close()
})
beforeEach(() => {
jest.mocked(isNonGlibcLinux).mockReturnValue(Promise.resolve(false))
fetchMock.mockClear()
@@ -47,9 +53,13 @@ test.skip('install Node using a custom node mirror', async () => {
tempDir()
const nodeMirrorBaseUrl = 'https://pnpm-node-mirror-test.localhost/download/release/'
const storeDir = path.resolve('store')
const storeIndex = new StoreIndex(storeDir)
storeIndexes.push(storeIndex)
const opts: FetchNodeOptions = {
nodeMirrorBaseUrl,
storeDir: path.resolve('store'),
storeDir,
storeIndex,
}
await fetchNode(fetchMock, '16.4.0', path.resolve('node'), opts)
@@ -62,8 +72,12 @@ test.skip('install Node using a custom node mirror', async () => {
test.skip('install Node using the default node mirror', async () => {
tempDir()
const storeDir = path.resolve('store')
const storeIndex = new StoreIndex(storeDir)
storeIndexes.push(storeIndex)
const opts: FetchNodeOptions = {
storeDir: path.resolve('store'),
storeDir,
storeIndex,
}
await fetchNode(fetchMock, '16.4.0', path.resolve('node'), opts)
@@ -81,9 +95,12 @@ test('auto-detects musl on non-glibc Linux and uses unofficial-builds mirror', a
// The function will throw because the downloaded tarball content won't match
// the fake sha256 we put in the SHASUMS256.txt mock, but all fetch calls are
// recorded before the integrity check, so we can assert the correct URLs.
const storeIndex = new StoreIndex(path.resolve('store'))
storeIndexes.push(storeIndex)
await expect(
fetchNode(fetchMock, '22.0.0', path.resolve('node'), {
storeDir: path.resolve('store'),
storeIndex,
platform: 'linux',
arch: 'x64',
retry: { retries: 0 },

View File

@@ -30,6 +30,9 @@
{
"path": "../../store/create-cafs-store"
},
{
"path": "../../store/index"
},
{
"path": "../node.resolver"
}

View File

@@ -43,7 +43,6 @@
"@pnpm/deps.graph-sequencer": "workspace:*",
"@pnpm/error": "workspace:*",
"@pnpm/exec.pkg-requires-build": "workspace:*",
"@pnpm/fs.msgpack-file": "workspace:*",
"@pnpm/get-context": "workspace:*",
"@pnpm/lifecycle": "workspace:*",
"@pnpm/link-bins": "workspace:*",
@@ -58,6 +57,7 @@
"@pnpm/store-connection-manager": "workspace:*",
"@pnpm/store-controller-types": "workspace:*",
"@pnpm/store.cafs": "workspace:*",
"@pnpm/store.index": "workspace:*",
"@pnpm/types": "workspace:*",
"@pnpm/workspace.find-packages": "workspace:*",
"load-json-file": "catalog:",

View File

@@ -1,7 +1,8 @@
import assert from 'assert'
import path from 'path'
import util from 'util'
import { getIndexFilePathInCafs, type PackageFilesIndex } from '@pnpm/store.cafs'
import { type PackageFilesIndex } from '@pnpm/store.cafs'
import { storeIndexKey } from '@pnpm/store.index'
import { calcDepState, lockfileToDepGraph, type DepsStateCache } from '@pnpm/calc-dep-state'
import {
LAYOUT_VERSION,
@@ -36,7 +37,7 @@ import {
import { createAllowBuildFunction } from '@pnpm/builder.policy'
import { pkgRequiresBuild } from '@pnpm/exec.pkg-requires-build'
import * as dp from '@pnpm/dependency-path'
import { readMsgpackFile } from '@pnpm/fs.msgpack-file'
import { StoreIndex } from '@pnpm/store.index'
import { safeReadPackageJsonFromDir } from '@pnpm/read-package-json'
import { hardLinkDir } from '@pnpm/worker'
import { runGroups } from 'run-groups'
@@ -328,6 +329,7 @@ async function _rebuild (
return false
}
const builtDepPaths = new Set<string>()
const storeIndex = opts.skipIfHasSideEffectsCache ? new StoreIndex(opts.storeDir) : undefined
const groups = chunks.map((chunk) => chunk.filter((depPath) => ctx.pkgsToRebuild.has(depPath) && !ctx.skipped.has(depPath)).map((depPath) =>
async () => {
@@ -356,11 +358,8 @@ async function _rebuild (
let sideEffectsCacheKey: string | undefined
const pkgId = `${pkgInfo.name}@${pkgInfo.version}`
if (opts.skipIfHasSideEffectsCache && resolution.integrity) {
const filesIndexFile = getIndexFilePathInCafs(opts.storeDir, resolution.integrity!.toString(), pkgId)
let pkgFilesIndex: PackageFilesIndex | undefined
try {
pkgFilesIndex = await readMsgpackFile<PackageFilesIndex>(filesIndexFile)
} catch {}
const filesIndexFile = storeIndexKey(resolution.integrity!.toString(), pkgId)
const pkgFilesIndex = storeIndex!.get(filesIndexFile) as PackageFilesIndex | undefined
if (pkgFilesIndex) {
sideEffectsCacheKey = calcDepState(depGraph, depsStateCache, depPath, {
includeDepGraphHash: true,
@@ -393,7 +392,7 @@ async function _rebuild (
})
if (hasSideEffects && (opts.sideEffectsCacheWrite ?? true) && resolution.integrity) {
builtDepPaths.add(depPath)
const filesIndexFile = getIndexFilePathInCafs(opts.storeDir, resolution.integrity!.toString(), pkgId)
const filesIndexFile = storeIndexKey(resolution.integrity!.toString(), pkgId)
try {
if (!sideEffectsCacheKey) {
sideEffectsCacheKey = calcDepState(depGraph, depsStateCache, depPath, {
@@ -439,6 +438,7 @@ async function _rebuild (
))
await runGroups(opts.childConcurrency || 5, groups)
storeIndex?.close()
if (builtDepPaths.size > 0) {
// It may be optimized because some bins were already linked before running lifecycle scripts

View File

@@ -1,8 +1,8 @@
/// <reference path="../../../__typings__/index.d.ts" />
import fs from 'fs'
import path from 'path'
import { readMsgpackFileSync, writeMsgpackFileSync } from '@pnpm/fs.msgpack-file'
import { getIndexFilePathInCafs, type PackageFilesIndex } from '@pnpm/store.cafs'
import { type PackageFilesIndex } from '@pnpm/store.cafs'
import { StoreIndex, storeIndexKey } from '@pnpm/store.index'
import { ENGINE_NAME, STORE_VERSION, WANTED_LOCKFILE } from '@pnpm/constants'
import { hashObject } from '@pnpm/crypto.object-hasher'
import { rebuild } from '@pnpm/plugin-commands-rebuild'
@@ -17,6 +17,11 @@ const REGISTRY = `http://localhost:${REGISTRY_MOCK_PORT}/`
const pnpmBin = path.join(import.meta.dirname, '../../../pnpm/bin/pnpm.mjs')
const f = fixtures(import.meta.dirname)
const storeIndexes: StoreIndex[] = []
afterAll(() => {
for (const si of storeIndexes) si.close()
})
test('rebuilds dependencies', async () => {
const project = prepare()
const cacheDir = path.resolve('cache')
@@ -76,8 +81,10 @@ test('rebuilds dependencies', async () => {
])
}
const cacheIntegrityPath = getIndexFilePathInCafs(path.join(storeDir, STORE_VERSION), getIntegrity('@pnpm.e2e/pre-and-postinstall-scripts-example', '1.0.0'), '@pnpm.e2e/pre-and-postinstall-scripts-example@1.0.0')
const cacheIntegrity = readMsgpackFileSync<PackageFilesIndex>(cacheIntegrityPath)!
const cacheIntegrityPath = storeIndexKey(getIntegrity('@pnpm.e2e/pre-and-postinstall-scripts-example', '1.0.0'), '@pnpm.e2e/pre-and-postinstall-scripts-example@1.0.0')
const storeIndex1 = new StoreIndex(path.join(storeDir, STORE_VERSION))
storeIndexes.push(storeIndex1)
const cacheIntegrity = storeIndex1.get(cacheIntegrityPath) as PackageFilesIndex
expect(cacheIntegrity!.sideEffects).toBeTruthy()
const sideEffectsKey = `${ENGINE_NAME};deps=${hashObject({
id: `@pnpm.e2e/pre-and-postinstall-scripts-example@1.0.0:${getIntegrity('@pnpm.e2e/pre-and-postinstall-scripts-example', '1.0.0')}`,
@@ -109,8 +116,10 @@ test('skipIfHasSideEffectsCache', async () => {
'--config.enableGlobalVirtualStore=false',
])
const cacheIntegrityPath = getIndexFilePathInCafs(path.join(storeDir, STORE_VERSION), getIntegrity('@pnpm.e2e/pre-and-postinstall-scripts-example', '1.0.0'), '@pnpm.e2e/pre-and-postinstall-scripts-example@1.0.0')
let cacheIntegrity = readMsgpackFileSync<PackageFilesIndex>(cacheIntegrityPath)!
const cacheIntegrityPath = storeIndexKey(getIntegrity('@pnpm.e2e/pre-and-postinstall-scripts-example', '1.0.0'), '@pnpm.e2e/pre-and-postinstall-scripts-example@1.0.0')
const storeIndex = new StoreIndex(path.join(storeDir, STORE_VERSION))
storeIndexes.push(storeIndex)
let cacheIntegrity = storeIndex.get(cacheIntegrityPath) as PackageFilesIndex
const sideEffectsKey = `${ENGINE_NAME};deps=${hashObject({ '@pnpm.e2e/hello-world-js-bin@1.0.0': {} })}`
cacheIntegrity.sideEffects = new Map([
[sideEffectsKey, {
@@ -123,7 +132,7 @@ test('skipIfHasSideEffectsCache', async () => {
]),
}],
])
writeMsgpackFileSync(cacheIntegrityPath, cacheIntegrity)
storeIndex.set(cacheIntegrityPath, cacheIntegrity)
let modules = project.readModulesManifest()
expect(modules!.pendingBuilds).toStrictEqual([
@@ -146,7 +155,7 @@ test('skipIfHasSideEffectsCache', async () => {
expect(modules).toBeTruthy()
expect(modules!.pendingBuilds).toHaveLength(0)
cacheIntegrity = readMsgpackFileSync<PackageFilesIndex>(cacheIntegrityPath)!
cacheIntegrity = storeIndex.get(cacheIntegrityPath) as PackageFilesIndex
expect(cacheIntegrity!.sideEffects).toBeTruthy()
expect(cacheIntegrity!.sideEffects!.get(sideEffectsKey)?.added?.has('foo')).toBeTruthy()
})

View File

@@ -42,9 +42,6 @@
{
"path": "../../deps/graph-sequencer"
},
{
"path": "../../fs/msgpack-file"
},
{
"path": "../../lockfile/types"
},
@@ -90,6 +87,9 @@
{
"path": "../../store/cafs"
},
{
"path": "../../store/index"
},
{
"path": "../../store/store-connection-manager"
},

View File

@@ -34,6 +34,7 @@
"@pnpm/error": "workspace:*",
"@pnpm/fetcher-base": "workspace:*",
"@pnpm/fetching-types": "workspace:*",
"@pnpm/store.index": "workspace:*",
"adm-zip": "catalog:",
"is-subdir": "catalog:",
"rename-overwrite": "catalog:",

View File

@@ -3,6 +3,7 @@ import fsPromises from 'fs/promises'
import { PnpmError } from '@pnpm/error'
import { type FetchFromRegistry } from '@pnpm/fetching-types'
import { type BinaryFetcher, type FetchFunction, type FetchResult } from '@pnpm/fetcher-base'
import { type StoreIndex } from '@pnpm/store.index'
import { addFilesFromDir } from '@pnpm/worker'
import AdmZip from 'adm-zip'
import isSubdir from 'is-subdir'
@@ -14,6 +15,7 @@ export function createBinaryFetcher (ctx: {
fetch: FetchFromRegistry
fetchFromRemoteTarball: FetchFunction
rawConfig: Record<string, string>
storeIndex: StoreIndex
offline?: boolean
}): { binary: BinaryFetcher } {
const fetchBinary: BinaryFetcher = async (cafs, resolution, opts) => {
@@ -48,6 +50,7 @@ export function createBinaryFetcher (ctx: {
}, tempLocation)
fetchResult = await addFilesFromDir({
storeDir: cafs.storeDir,
storeIndex: ctx.storeIndex,
dir: tempLocation,
filesIndexFile: opts.filesIndexFile,
readManifest: false,

View File

@@ -15,6 +15,9 @@
{
"path": "../../packages/error"
},
{
"path": "../../store/index"
},
{
"path": "../../worker"
},

View File

@@ -36,6 +36,7 @@
"@pnpm/fetcher-base": "workspace:*",
"@pnpm/fs.packlist": "workspace:*",
"@pnpm/prepare-package": "workspace:*",
"@pnpm/store.index": "workspace:*",
"@zkochan/rimraf": "catalog:",
"execa": "catalog:"
},

View File

@@ -5,6 +5,7 @@ import type { GitFetcher } from '@pnpm/fetcher-base'
import { packlist } from '@pnpm/fs.packlist'
import { globalWarn } from '@pnpm/logger'
import { preparePackage } from '@pnpm/prepare-package'
import { type StoreIndex } from '@pnpm/store.index'
import { addFilesFromDir } from '@pnpm/worker'
import { PnpmError } from '@pnpm/error'
import rimraf from '@zkochan/rimraf'
@@ -14,6 +15,7 @@ import { URL } from 'url'
export interface CreateGitFetcherOptions {
gitShallowHosts?: string[]
rawConfig: Record<string, unknown>
storeIndex: StoreIndex
unsafePerm?: boolean
ignoreScripts?: boolean
}
@@ -61,6 +63,7 @@ export function createGitFetcher (createOpts: CreateGitFetcherOptions): { git: G
// the linking of files to the store is in progress.
return addFilesFromDir({
storeDir: cafs.storeDir,
storeIndex: createOpts.storeIndex,
dir: pkgDir,
files,
filesIndexFile: opts.filesIndexFile,

View File

@@ -1,6 +1,7 @@
/// <reference path="../../../__typings__/index.d.ts"/>
import path from 'path'
import { createCafsStore } from '@pnpm/create-cafs-store'
import { StoreIndex } from '@pnpm/store.index'
import { jest } from '@jest/globals'
import { temporaryDirectory } from 'tempy'
import { lexCompare } from '@pnpm/util.lex-comparator'
@@ -29,6 +30,17 @@ const { globalWarn } = await import('@pnpm/logger')
const { default: execa } = await import('execa')
const { createGitFetcher } = await import('@pnpm/git-fetcher')
const storeIndexes: StoreIndex[] = []
afterAll(() => {
for (const si of storeIndexes) si.close()
})
function createStoreIndex (storeDir: string): StoreIndex {
const si = new StoreIndex(storeDir)
storeIndexes.push(si)
return si
}
beforeEach(() => {
jest.mocked(execa).mockClear()
jest.mocked(globalWarn).mockClear()
@@ -36,7 +48,7 @@ beforeEach(() => {
test('fetch', async () => {
const storeDir = temporaryDirectory()
const fetch = createGitFetcher({ rawConfig: {} }).git
const fetch = createGitFetcher({ rawConfig: {}, storeIndex: createStoreIndex(storeDir) }).git
const { filesMap, manifest } = await fetch(
createCafsStore(storeDir),
{
@@ -55,7 +67,7 @@ test('fetch', async () => {
test('fetch a package from Git sub folder', async () => {
const storeDir = temporaryDirectory()
const fetch = createGitFetcher({ rawConfig: {} }).git
const fetch = createGitFetcher({ rawConfig: {}, storeIndex: createStoreIndex(storeDir) }).git
const { filesMap } = await fetch(
createCafsStore(storeDir),
{
@@ -73,7 +85,7 @@ test('fetch a package from Git sub folder', async () => {
test('prevent directory traversal attack when using Git sub folder', async () => {
const storeDir = temporaryDirectory()
const fetch = createGitFetcher({ rawConfig: {} }).git
const fetch = createGitFetcher({ rawConfig: {}, storeIndex: createStoreIndex(storeDir) }).git
const repo = 'https://github.com/RexSkz/test-git-subfolder-fetch.git'
const pkgDir = '../../etc'
await expect(
@@ -94,7 +106,7 @@ test('prevent directory traversal attack when using Git sub folder', async () =>
test('prevent directory traversal attack when using Git sub folder #2', async () => {
const storeDir = temporaryDirectory()
const fetch = createGitFetcher({ rawConfig: {} }).git
const fetch = createGitFetcher({ rawConfig: {}, storeIndex: createStoreIndex(storeDir) }).git
const repo = 'https://github.com/RexSkz/test-git-subfolder-fetch.git'
const pkgDir = 'not/exists'
await expect(
@@ -117,6 +129,7 @@ test('fetch a package from Git that has a prepare script', async () => {
const storeDir = temporaryDirectory()
const fetch = createGitFetcher({
rawConfig: {},
storeIndex: createStoreIndex(storeDir),
}).git
const { filesMap } = await fetch(
createCafsStore(storeDir),
@@ -136,7 +149,7 @@ test('fetch a package from Git that has a prepare script', async () => {
// Test case for https://github.com/pnpm/pnpm/issues/1866
test('fetch a package without a package.json', async () => {
const storeDir = temporaryDirectory()
const fetch = createGitFetcher({ rawConfig: {} }).git
const fetch = createGitFetcher({ rawConfig: {}, storeIndex: createStoreIndex(storeDir) }).git
const { filesMap } = await fetch(
createCafsStore(storeDir),
{
@@ -155,7 +168,7 @@ test('fetch a package without a package.json', async () => {
// Covers the regression reported in https://github.com/pnpm/pnpm/issues/4064
test('fetch a big repository', async () => {
const storeDir = temporaryDirectory()
const fetch = createGitFetcher({ rawConfig: {} }).git
const fetch = createGitFetcher({ rawConfig: {}, storeIndex: createStoreIndex(storeDir) }).git
const { filesMap } = await fetch(createCafsStore(storeDir),
{
commit: 'a65fbf5a90f53c9d72fed4daaca59da50f074355',
@@ -169,7 +182,7 @@ test('fetch a big repository', async () => {
test('still able to shallow fetch for allowed hosts', async () => {
const storeDir = temporaryDirectory()
const fetch = createGitFetcher({ gitShallowHosts: ['github.com'], rawConfig: {} }).git
const fetch = createGitFetcher({ gitShallowHosts: ['github.com'], rawConfig: {}, storeIndex: createStoreIndex(storeDir) }).git
const resolution = {
commit: 'c9b30e71d704cd30fa71f2edd1ecc7dcc4985493',
repo: 'https://github.com/kevva/is-positive.git',
@@ -200,6 +213,7 @@ test('fail when preparing a git-hosted package', async () => {
const storeDir = temporaryDirectory()
const fetch = createGitFetcher({
rawConfig: {},
storeIndex: createStoreIndex(storeDir),
}).git
await expect(
fetch(createCafsStore(storeDir),
@@ -218,6 +232,7 @@ test('fail when preparing a git-hosted package with a partial commit', async ()
const storeDir = temporaryDirectory()
const fetch = createGitFetcher({
rawConfig: {},
storeIndex: createStoreIndex(storeDir),
}).git
await expect(
fetch(createCafsStore(storeDir),
@@ -233,7 +248,7 @@ test('fail when preparing a git-hosted package with a partial commit', async ()
test('do not build the package when scripts are ignored', async () => {
const storeDir = temporaryDirectory()
const fetch = createGitFetcher({ ignoreScripts: true, rawConfig: {} }).git
const fetch = createGitFetcher({ ignoreScripts: true, rawConfig: {}, storeIndex: createStoreIndex(storeDir) }).git
const { filesMap } = await fetch(createCafsStore(storeDir),
{
commit: '55416a9c468806a935636c0ad0371a14a64df8c9',
@@ -249,7 +264,7 @@ test('do not build the package when scripts are ignored', async () => {
test('block git package with prepare script', async () => {
const storeDir = temporaryDirectory()
const fetch = createGitFetcher({ rawConfig: {} }).git
const fetch = createGitFetcher({ rawConfig: {}, storeIndex: createStoreIndex(storeDir) }).git
const repo = 'https://github.com/pnpm-e2e/prepare-script-works.git'
await expect(
fetch(createCafsStore(storeDir),
@@ -268,6 +283,7 @@ test('allow git package with prepare script', async () => {
const storeDir = temporaryDirectory()
const fetch = createGitFetcher({
rawConfig: {},
storeIndex: createStoreIndex(storeDir),
}).git
// This should succeed without throwing because the package is in the allowlist
const { filesMap } = await fetch(createCafsStore(storeDir),
@@ -290,7 +306,7 @@ function prefixGitArgs (): string[] {
test('fetch only the included files', async () => {
const storeDir = temporaryDirectory()
const fetch = createGitFetcher({ rawConfig: {} }).git
const fetch = createGitFetcher({ rawConfig: {}, storeIndex: createStoreIndex(storeDir) }).git
const { filesMap } = await fetch(
createCafsStore(storeDir),
{

View File

@@ -30,6 +30,9 @@
{
"path": "../../store/create-cafs-store"
},
{
"path": "../../store/index"
},
{
"path": "../../worker"
},

View File

@@ -42,6 +42,7 @@
"@pnpm/create-cafs-store": "workspace:*",
"@pnpm/fetch": "workspace:*",
"@pnpm/pick-fetcher": "workspace:*",
"@pnpm/store.index": "workspace:*",
"@pnpm/tarball-fetcher": "workspace:*",
"@pnpm/test-fixtures": "workspace:*",
"nock": "catalog:",

View File

@@ -3,6 +3,7 @@ import { jest } from '@jest/globals'
import { createTarballFetcher } from '@pnpm/tarball-fetcher'
import { createFetchFromRegistry } from '@pnpm/fetch'
import { createCafsStore } from '@pnpm/create-cafs-store'
import { StoreIndex } from '@pnpm/store.index'
import { fixtures } from '@pnpm/test-fixtures'
import { temporaryDirectory } from 'tempy'
import path from 'path'
@@ -13,6 +14,10 @@ import type { AtomicResolution } from '@pnpm/resolver-base'
import type { CustomFetcher } from '@pnpm/hooks.types'
const f = fixtures(import.meta.dirname)
const storeIndex = new StoreIndex(temporaryDirectory())
afterAll(() => {
storeIndex.close()
})
// Test helpers to reduce type casting
function createMockFetchers (partial: Partial<Fetchers> = {}): Fetchers {
@@ -280,7 +285,7 @@ describe('custom fetcher implementation examples', () => {
const tarballFetchers = createTarballFetcher(
fetchFromRegistry,
() => undefined,
{ rawConfig: {} }
{ rawConfig: {}, storeIndex }
)
// Custom fetcher that maps custom URLs to tarballs
@@ -328,7 +333,7 @@ describe('custom fetcher implementation examples', () => {
const tarballFetchers = createTarballFetcher(
fetchFromRegistry,
() => undefined,
{ rawConfig: {} }
{ rawConfig: {}, storeIndex }
)
// Custom fetcher that maps custom local paths to tarballs
@@ -379,7 +384,7 @@ describe('custom fetcher implementation examples', () => {
const tarballFetchers = createTarballFetcher(
fetchFromRegistry,
() => undefined,
{ rawConfig: {} }
{ rawConfig: {}, storeIndex }
)
// Custom fetcher that transforms custom resolution to tarball URL
@@ -428,7 +433,7 @@ describe('custom fetcher implementation examples', () => {
const tarballFetchers = createTarballFetcher(
fetchFromRegistry,
() => undefined,
{ rawConfig: {}, ignoreScripts: true }
{ rawConfig: {}, storeIndex, ignoreScripts: true }
)
// Custom fetcher that maps custom git resolution to git-hosted tarball

View File

@@ -30,6 +30,9 @@
{
"path": "../../store/create-cafs-store"
},
{
"path": "../../store/index"
},
{
"path": "../fetcher-base"
},

View File

@@ -40,13 +40,12 @@
"@pnpm/fs.packlist": "workspace:*",
"@pnpm/graceful-fs": "workspace:*",
"@pnpm/prepare-package": "workspace:*",
"@pnpm/store.index": "workspace:*",
"@pnpm/types": "workspace:*",
"@zkochan/retry": "catalog:",
"lodash.throttle": "catalog:",
"p-map-values": "catalog:",
"path-temp": "catalog:",
"ramda": "catalog:",
"rename-overwrite": "catalog:"
"ramda": "catalog:"
},
"peerDependencies": {
"@pnpm/logger": "catalog:",

View File

@@ -1,15 +1,13 @@
import assert from 'assert'
import fs from 'node:fs/promises'
import util from 'util'
import { type FetchFunction, type FetchOptions } from '@pnpm/fetcher-base'
import { type Cafs, type FilesMap } from '@pnpm/cafs-types'
import { packlist } from '@pnpm/fs.packlist'
import { globalWarn } from '@pnpm/logger'
import { preparePackage } from '@pnpm/prepare-package'
import { type StoreIndex } from '@pnpm/store.index'
import { type BundledManifest } from '@pnpm/types'
import { addFilesFromDir } from '@pnpm/worker'
import renameOverwrite from 'rename-overwrite'
import { fastPathTemp as pathTemp } from 'path-temp'
interface Resolution {
integrity?: string
@@ -21,18 +19,22 @@ interface Resolution {
export interface CreateGitHostedTarballFetcher {
ignoreScripts?: boolean
rawConfig: Record<string, unknown>
storeIndex: StoreIndex
unsafePerm?: boolean
}
export function createGitHostedTarballFetcher (fetchRemoteTarball: FetchFunction, fetcherOpts: CreateGitHostedTarballFetcher): FetchFunction {
const fetch = async (cafs: Cafs, resolution: Resolution, opts: FetchOptions) => {
const tempIndexFile = pathTemp(opts.filesIndexFile)
const rawFilesIndexFile = `${opts.filesIndexFile}\traw`
const { filesMap, manifest, requiresBuild } = await fetchRemoteTarball(cafs, resolution, {
...opts,
filesIndexFile: tempIndexFile,
filesIndexFile: rawFilesIndexFile,
})
// Flush any queued store index writes so that the raw files index entry
// written during tarball extraction is visible to subsequent reads.
fetcherOpts.storeIndex.flush()
try {
const prepareResult = await prepareGitHostedPkg(filesMap, cafs, tempIndexFile, opts.filesIndexFile, fetcherOpts, opts, resolution)
const prepareResult = await prepareGitHostedPkg(filesMap, cafs, rawFilesIndexFile, opts.filesIndexFile, fetcherOpts, opts, resolution)
if (prepareResult.ignoredBuild) {
globalWarn(`The git-hosted package fetched from "${resolution.tarball}" has to be built but the build scripts were ignored.`)
}
@@ -60,7 +62,7 @@ interface PrepareGitHostedPkgResult {
async function prepareGitHostedPkg (
filesMap: FilesMap,
cafs: Cafs,
filesIndexFileNonBuilt: string,
rawFilesIndexFile: string,
filesIndexFile: string,
opts: CreateGitHostedTarballFetcher,
fetcherOpts: FetchOptions,
@@ -80,10 +82,13 @@ async function prepareGitHostedPkg (
allowBuild: fetcherOpts.allowBuild,
}, tempLocation, resolution.path ?? '')
const files = await packlist(pkgDir)
const { storeIndex } = opts
if (!resolution.path && files.length === filesMap.size) {
if (!shouldBeBuilt) {
if (filesIndexFileNonBuilt !== filesIndexFile) {
await renameOverwrite(filesIndexFileNonBuilt, filesIndexFile)
const data = storeIndex.get(rawFilesIndexFile)
if (data) {
storeIndex.set(filesIndexFile, data)
storeIndex.delete(rawFilesIndexFile)
}
return {
filesMap,
@@ -91,22 +96,21 @@ async function prepareGitHostedPkg (
}
}
if (opts.ignoreScripts) {
storeIndex.delete(rawFilesIndexFile)
return {
filesMap,
ignoredBuild: true,
}
}
}
try {
// The temporary index file may be deleted
await fs.unlink(filesIndexFileNonBuilt)
} catch {}
storeIndex.delete(rawFilesIndexFile)
// Important! We cannot remove the temp location at this stage.
// Even though we have the index of the package,
// the linking of files to the store is in progress.
return {
...await addFilesFromDir({
storeDir: cafs.storeDir,
storeIndex: opts.storeIndex,
dir: pkgDir,
files,
filesIndexFile,

View File

@@ -10,6 +10,7 @@ import {
type GetAuthHeader,
type RetryTimeoutOptions,
} from '@pnpm/fetching-types'
import { type StoreIndex } from '@pnpm/store.index'
import { TarballIntegrityError } from '@pnpm/worker'
import {
createDownloader,
@@ -41,6 +42,7 @@ export function createTarballFetcher (
rawConfig: Record<string, unknown>
unsafePerm?: boolean
ignoreScripts?: boolean
storeIndex: StoreIndex
timeout?: number
retry?: RetryTimeoutOptions
offline?: boolean
@@ -56,10 +58,11 @@ export function createTarballFetcher (
download,
getAuthHeaderByURI: getAuthHeader,
offline: opts.offline,
storeIndex: opts.storeIndex,
}) as FetchFunction
return {
localTarball: createLocalTarballFetcher(),
localTarball: createLocalTarballFetcher(opts.storeIndex),
remoteTarball: remoteTarballFetcher,
gitHostedTarball: createGitHostedTarballFetcher(remoteTarballFetcher, opts),
}
@@ -70,6 +73,7 @@ async function fetchFromTarball (
download: DownloadFunction
getAuthHeaderByURI: (registry: string) => string | undefined
offline?: boolean
storeIndex: StoreIndex
},
cafs: Cafs,
resolution: {
@@ -86,6 +90,7 @@ async function fetchFromTarball (
return ctx.download(resolution.tarball, {
getAuthHeaderByURI: ctx.getAuthHeaderByURI,
cafs,
storeIndex: ctx.storeIndex,
integrity: resolution.integrity,
readManifest: opts.readManifest,
onProgress: opts.onProgress,

View File

@@ -2,6 +2,7 @@ import path from 'path'
import { type FetchFunction, type FetchOptions } from '@pnpm/fetcher-base'
import type { Cafs } from '@pnpm/cafs-types'
import gfs from '@pnpm/graceful-fs'
import { type StoreIndex } from '@pnpm/store.index'
import { addFilesFromTarball } from '@pnpm/worker'
const isAbsolutePath = /^\/|^[A-Z]:/i
@@ -12,12 +13,13 @@ interface Resolution {
tarball: string
}
export function createLocalTarballFetcher (): FetchFunction {
export function createLocalTarballFetcher (storeIndex: StoreIndex): FetchFunction {
const fetch = (cafs: Cafs, resolution: Resolution, opts: FetchOptions) => {
const tarball = resolvePath(opts.lockfileDir, resolution.tarball.slice(5))
const buffer = gfs.readFileSync(tarball)
return addFilesFromTarball({
storeDir: cafs.storeDir,
storeIndex,
buffer,
filesIndexFile: opts.filesIndexFile,
integrity: resolution.integrity,

View File

@@ -7,6 +7,7 @@ import { type FetchResult, type FetchOptions } from '@pnpm/fetcher-base'
import { type Cafs } from '@pnpm/cafs-types'
import { type FetchFromRegistry } from '@pnpm/fetching-types'
import { globalWarn } from '@pnpm/logger'
import { type StoreIndex } from '@pnpm/store.index'
import { addFilesFromTarball } from '@pnpm/worker'
import * as retry from '@zkochan/retry'
import throttle from 'lodash.throttle'
@@ -25,6 +26,7 @@ export type DownloadOptions = {
onStart?: (totalSize: number | null, attempt: number) => void
onProgress?: (downloaded: number) => void
integrity?: string
storeIndex: StoreIndex
} & Pick<FetchOptions, 'pkg' | 'appendManifest' | 'readManifest' | 'filesIndexFile'>
export type DownloadFunction = (url: string, opts: DownloadOptions) => Promise<FetchResult>
@@ -165,6 +167,7 @@ export function createDownloader (
return addFilesFromTarball({
buffer: data,
storeDir: opts.cafs.storeDir,
storeIndex: opts.storeIndex,
readManifest: opts.readManifest,
integrity: opts.integrity,
filesIndexFile: opts.filesIndexFile,

View File

@@ -7,6 +7,7 @@ import { createFetchFromRegistry } from '@pnpm/fetch'
import { createCafsStore } from '@pnpm/create-cafs-store'
import { fixtures } from '@pnpm/test-fixtures'
import { lexCompare } from '@pnpm/util.lex-comparator'
import { StoreIndex } from '@pnpm/store.index'
import nock from 'nock'
import ssri from 'ssri'
import { temporaryDirectory } from 'tempy'
@@ -34,6 +35,11 @@ beforeEach(() => {
const storeDir = temporaryDirectory()
const filesIndexFile = path.join(storeDir, 'index.json')
const cafs = createCafsStore(storeDir)
const storeIndex = new StoreIndex(storeDir)
afterAll(() => {
storeIndex.close()
})
const f = fixtures(import.meta.dirname)
const tarballPath = f.find('babel-helper-hoist-variables-6.24.1.tgz')
@@ -44,6 +50,7 @@ const fetchFromRegistry = createFetchFromRegistry({})
const getAuthHeader = () => undefined
const fetch = createTarballFetcher(fetchFromRegistry, getAuthHeader, {
rawConfig: {},
storeIndex,
retry: {
maxTimeout: 100,
minTimeout: 0,
@@ -239,6 +246,7 @@ test("don't fail when fetching a local tarball in offline mode", async () => {
const fetch = createTarballFetcher(fetchFromRegistry, getAuthHeader, {
offline: true,
rawConfig: {},
storeIndex,
retry: {
maxTimeout: 100,
minTimeout: 0,
@@ -266,6 +274,7 @@ test('fail when trying to fetch a non-local tarball in offline mode', async () =
const fetch = createTarballFetcher(fetchFromRegistry, getAuthHeader, {
offline: true,
rawConfig: {},
storeIndex,
retry: {
maxTimeout: 100,
minTimeout: 0,
@@ -396,6 +405,7 @@ test('accessing private packages', async () => {
const getAuthHeader = () => 'Bearer ofjergrg349gj3f2'
const fetch = createTarballFetcher(fetchFromRegistry, getAuthHeader, {
rawConfig: {},
storeIndex,
retry: {
maxTimeout: 100,
minTimeout: 0,
@@ -504,6 +514,7 @@ test('do not build the package when scripts are ignored', async () => {
const fetch = createTarballFetcher(fetchFromRegistry, getAuthHeader, {
ignoreScripts: true,
rawConfig: {},
storeIndex,
retry: {
maxTimeout: 100,
minTimeout: 0,
@@ -549,6 +560,7 @@ test('use the subfolder when path is present', async () => {
const fetch = createTarballFetcher(fetchFromRegistry, getAuthHeader, {
ignoreScripts: true,
rawConfig: {},
storeIndex,
retry: {
maxTimeout: 100,
minTimeout: 0,
@@ -575,6 +587,7 @@ test('prevent directory traversal attack when path is present', async () => {
const fetch = createTarballFetcher(fetchFromRegistry, getAuthHeader, {
ignoreScripts: true,
rawConfig: {},
storeIndex,
retry: {
maxTimeout: 100,
minTimeout: 0,
@@ -599,6 +612,7 @@ test('fail when path is not exists', async () => {
const fetch = createTarballFetcher(fetchFromRegistry, getAuthHeader, {
ignoreScripts: true,
rawConfig: {},
storeIndex,
retry: {
maxTimeout: 100,
minTimeout: 0,

View File

@@ -45,6 +45,9 @@
{
"path": "../../store/create-cafs-store"
},
{
"path": "../../store/index"
},
{
"path": "../../worker"
},

View File

@@ -38,12 +38,12 @@
"dependencies": {
"@pnpm/config": "workspace:*",
"@pnpm/dependency-path": "workspace:*",
"@pnpm/fs.msgpack-file": "workspace:*",
"@pnpm/lockfile.fs": "workspace:*",
"@pnpm/lockfile.utils": "workspace:*",
"@pnpm/logger": "workspace:*",
"@pnpm/store-path": "workspace:*",
"@pnpm/store.cafs": "workspace:*",
"@pnpm/store.index": "workspace:*",
"@pnpm/types": "workspace:*",
"hyperdrive-schemas": "catalog:",
"normalize-path": "catalog:"

View File

@@ -1,7 +1,7 @@
// cspell:ignore ents
import fs from 'fs'
import { readMsgpackFileSync } from '@pnpm/fs.msgpack-file'
import { getIndexFilePathInCafs, getFilePathByModeInCafs, type PackageFilesIndex } from '@pnpm/store.cafs'
import { StoreIndex, storeIndexKey } from '@pnpm/store.index'
import { getFilePathByModeInCafs, type PackageFilesIndex } from '@pnpm/store.cafs'
import { type LockfileObject, readWantedLockfile, type PackageSnapshot, type TarballResolution } from '@pnpm/lockfile.fs'
import {
nameVerFromPkgSnapshot,
@@ -39,6 +39,7 @@ export async function createFuseHandlers (lockfileDir: string, storeDir: string)
}
export function createFuseHandlersFromLockfile (lockfile: LockfileObject, storeDir: string): FuseHandlers {
const storeIndex = new StoreIndex(storeDir)
const pkgSnapshotCache = new Map<string, { name: string, version: string, pkgSnapshot: PackageSnapshot, index: PackageFilesIndex }>()
const virtualNodeModules = makeVirtualNodeModules(lockfile)
return {
@@ -156,7 +157,7 @@ export function createFuseHandlersFromLockfile (lockfile: LockfileObject, storeD
currentDirEntry = currentDirEntry.entries[parts.shift()!]
}
if (currentDirEntry?.entryType === 'index') {
const pkg = getPkgInfo(currentDirEntry.depPath, storeDir)
const pkg = getPkgInfo(currentDirEntry.depPath)
if (pkg == null) {
return null
}
@@ -168,13 +169,13 @@ export function createFuseHandlersFromLockfile (lockfile: LockfileObject, storeD
}
return currentDirEntry
}
function getPkgInfo (depPath: string, storeDir: string) {
function getPkgInfo (depPath: string) {
if (!pkgSnapshotCache.has(depPath)) {
const pkgSnapshot = lockfile.packages?.[depPath as DepPath]
if (pkgSnapshot == null) return undefined
const nameVer = nameVerFromPkgSnapshot(depPath, pkgSnapshot)
const pkgIndexFilePath = getIndexFilePathInCafs(storeDir, (pkgSnapshot.resolution as TarballResolution).integrity!, `${nameVer.name}@${nameVer.version}`)
const pkgIndex = readMsgpackFileSync<PackageFilesIndex>(pkgIndexFilePath) // TODO: maybe make it async?
const pkgIndexFilePath = storeIndexKey((pkgSnapshot.resolution as TarballResolution).integrity!, `${nameVer.name}@${nameVer.version}`)
const pkgIndex = storeIndex.get(pkgIndexFilePath) as PackageFilesIndex
pkgSnapshotCache.set(depPath, {
...nameVer,
pkgSnapshot,

View File

@@ -12,9 +12,6 @@
{
"path": "../../config/config"
},
{
"path": "../../fs/msgpack-file"
},
{
"path": "../../lockfile/fs"
},
@@ -36,6 +33,9 @@
{
"path": "../../store/cafs"
},
{
"path": "../../store/index"
},
{
"path": "../../store/store-path"
}

View File

@@ -43,6 +43,7 @@
"@pnpm/network.auth-header": "workspace:*",
"@pnpm/node.fetcher": "workspace:*",
"@pnpm/resolver-base": "workspace:*",
"@pnpm/store.index": "workspace:*",
"@pnpm/tarball-fetcher": "workspace:*",
"@pnpm/types": "workspace:*",
"ramda": "catalog:"

View File

@@ -12,6 +12,7 @@ import { createDirectoryFetcher } from '@pnpm/directory-fetcher'
import { createGitFetcher } from '@pnpm/git-fetcher'
import { createTarballFetcher, type TarballFetchers } from '@pnpm/tarball-fetcher'
import { createGetAuthHeaderByURI } from '@pnpm/network.auth-header'
import { type StoreIndex } from '@pnpm/store.index'
import { createBinaryFetcher } from '@pnpm/fetching.binary-fetcher'
export type { ResolveFunction }
@@ -24,6 +25,7 @@ export type ClientOptions = {
rawConfig: Record<string, string>
sslConfigs?: Record<string, SslConfig>
retry?: RetryTimeoutOptions
storeIndex: StoreIndex
timeout?: number
unsafePerm?: boolean
userAgent?: string
@@ -53,7 +55,7 @@ export function createClient (opts: ClientOptions): Client {
}
}
export function createResolver (opts: ClientOptions): { resolve: ResolveFunction, clearCache: () => void } {
export function createResolver (opts: Omit<ClientOptions, 'storeIndex'>): { resolve: ResolveFunction, clearCache: () => void } {
const fetchFromRegistry = createFetchFromRegistry(opts)
const getAuthHeader = createGetAuthHeaderByURI({ allSettings: opts.authConfig, userSettings: opts.userConfig })
@@ -69,7 +71,7 @@ type Fetchers = {
function createFetchers (
fetchFromRegistry: FetchFromRegistry,
getAuthHeader: GetAuthHeader,
opts: Pick<ClientOptions, 'rawConfig' | 'retry' | 'gitShallowHosts' | 'resolveSymlinksInInjectedDirs' | 'unsafePerm' | 'includeOnlyPackageFiles' | 'offline' | 'fetchMinSpeedKiBps'>
opts: Pick<ClientOptions, 'rawConfig' | 'retry' | 'gitShallowHosts' | 'resolveSymlinksInInjectedDirs' | 'unsafePerm' | 'includeOnlyPackageFiles' | 'offline' | 'fetchMinSpeedKiBps' | 'storeIndex'>
): Fetchers {
const tarballFetchers = createTarballFetcher(fetchFromRegistry, getAuthHeader, opts)
return {
@@ -81,6 +83,7 @@ function createFetchers (
fetchFromRemoteTarball: tarballFetchers.remoteTarball,
offline: opts.offline,
rawConfig: opts.rawConfig,
storeIndex: opts.storeIndex,
}),
}
}

View File

@@ -1,7 +1,15 @@
/// <reference path="../../../__typings__/index.d.ts"/>
import { createClient, createResolver } from '@pnpm/client'
import { StoreIndex } from '@pnpm/store.index'
const storeIndexes: StoreIndex[] = []
afterAll(() => {
for (const si of storeIndexes) si.close()
})
test('createClient()', () => {
const storeIndex = new StoreIndex('.store')
storeIndexes.push(storeIndex)
const client = createClient({
authConfig: { registry: 'https://registry.npmjs.org/' },
cacheDir: '',
@@ -10,6 +18,7 @@ test('createClient()', () => {
default: 'https://reigstry.npmjs.org/',
},
storeDir: '.store',
storeIndex,
})
expect(typeof client === 'object').toBeTruthy()
})

View File

@@ -47,6 +47,9 @@
},
{
"path": "../../resolving/resolver-base"
},
{
"path": "../../store/index"
}
]
}

View File

@@ -137,6 +137,7 @@
"@pnpm/registry-mock": "catalog:",
"@pnpm/store-path": "workspace:*",
"@pnpm/store.cafs": "workspace:*",
"@pnpm/store.index": "workspace:*",
"@pnpm/test-fixtures": "workspace:*",
"@pnpm/test-ipc-server": "workspace:*",
"@pnpm/testing.temp-store": "workspace:*",

View File

@@ -5,8 +5,9 @@ import { ENGINE_NAME } from '@pnpm/constants'
import { install } from '@pnpm/core'
import { type IgnoredScriptsLog } from '@pnpm/core-loggers'
import { createHexHashFromFile } from '@pnpm/crypto.hash'
import { readMsgpackFileSync } from '@pnpm/fs.msgpack-file'
import { prepareEmpty } from '@pnpm/prepare'
import { getIntegrity } from '@pnpm/registry-mock'
import { StoreIndex, storeIndexKey } from '@pnpm/store.index'
import { fixtures } from '@pnpm/test-fixtures'
import { jest } from '@jest/globals'
import { sync as rimraf } from '@zkochan/rimraf'
@@ -14,6 +15,11 @@ import { testDefaults } from '../utils/index.js'
const f = fixtures(import.meta.dirname)
const storeIndexes: StoreIndex[] = []
afterAll(() => {
for (const si of storeIndexes) si.close()
})
test('patch package with exact version', async () => {
const reporter = jest.fn()
const project = prepareEmpty()
@@ -55,8 +61,10 @@ test('patch package with exact version', async () => {
})
expect(lockfile.snapshots[`is-positive@1.0.0(patch_hash=${patchFileHash})`]).toBeTruthy()
const filesIndexFile = path.join(opts.storeDir, 'index/c7/1ccf199e0fdae37aad13946b937d67bcd35fa111b84d21b3a19439cfdc2812-is-positive@1.0.0.mpk')
const filesIndex = readMsgpackFileSync<PackageFilesIndex>(filesIndexFile)
const filesIndexKey = storeIndexKey(getIntegrity('is-positive', '1.0.0'), 'is-positive@1.0.0')
const storeIndex = new StoreIndex(opts.storeDir)
storeIndexes.push(storeIndex)
const filesIndex = storeIndex.get(filesIndexKey) as PackageFilesIndex
expect(filesIndex.sideEffects).toBeTruthy()
const sideEffectsKey = `${ENGINE_NAME};patch=${patchFileHash}`
expect(filesIndex.sideEffects!.has(sideEffectsKey)).toBeTruthy()
@@ -153,8 +161,10 @@ test('patch package with version range', async () => {
})
expect(lockfile.snapshots[`is-positive@1.0.0(patch_hash=${patchFileHash})`]).toBeTruthy()
const filesIndexFile = path.join(opts.storeDir, 'index/c7/1ccf199e0fdae37aad13946b937d67bcd35fa111b84d21b3a19439cfdc2812-is-positive@1.0.0.mpk')
const filesIndex = readMsgpackFileSync<PackageFilesIndex>(filesIndexFile)
const filesIndexKey = storeIndexKey(getIntegrity('is-positive', '1.0.0'), 'is-positive@1.0.0')
const storeIndex = new StoreIndex(opts.storeDir)
storeIndexes.push(storeIndex)
const filesIndex = storeIndex.get(filesIndexKey) as PackageFilesIndex
expect(filesIndex.sideEffects).toBeTruthy()
const sideEffectsKey = `${ENGINE_NAME};patch=${patchFileHash}`
expect(filesIndex.sideEffects!.has(sideEffectsKey)).toBeTruthy()
@@ -323,8 +333,10 @@ test('patch package when scripts are ignored', async () => {
})
expect(lockfile.snapshots[`is-positive@1.0.0(patch_hash=${patchFileHash})`]).toBeTruthy()
const filesIndexFile = path.join(opts.storeDir, 'index/c7/1ccf199e0fdae37aad13946b937d67bcd35fa111b84d21b3a19439cfdc2812-is-positive@1.0.0.mpk')
const filesIndex = readMsgpackFileSync<PackageFilesIndex>(filesIndexFile)
const filesIndexKey = storeIndexKey(getIntegrity('is-positive', '1.0.0'), 'is-positive@1.0.0')
const storeIndex = new StoreIndex(opts.storeDir)
storeIndexes.push(storeIndex)
const filesIndex = storeIndex.get(filesIndexKey) as PackageFilesIndex
expect(filesIndex.sideEffects).toBeTruthy()
const sideEffectsKey = `${ENGINE_NAME};patch=${patchFileHash}`
expect(filesIndex.sideEffects!.has(sideEffectsKey)).toBeTruthy()
@@ -414,8 +426,10 @@ test('patch package when the package is not in allowBuilds list', async () => {
})
expect(lockfile.snapshots[`is-positive@1.0.0(patch_hash=${patchFileHash})`]).toBeTruthy()
const filesIndexFile = path.join(opts.storeDir, 'index/c7/1ccf199e0fdae37aad13946b937d67bcd35fa111b84d21b3a19439cfdc2812-is-positive@1.0.0.mpk')
const filesIndex = readMsgpackFileSync<PackageFilesIndex>(filesIndexFile)
const filesIndexKey = storeIndexKey(getIntegrity('is-positive', '1.0.0'), 'is-positive@1.0.0')
const storeIndex = new StoreIndex(opts.storeDir)
storeIndexes.push(storeIndex)
const filesIndex = storeIndex.get(filesIndexKey) as PackageFilesIndex
expect(filesIndex.sideEffects).toBeTruthy()
const sideEffectsKey = `${ENGINE_NAME};patch=${patchFileHash}`
expect(filesIndex.sideEffects!.has(sideEffectsKey)).toBeTruthy()

View File

@@ -2,8 +2,8 @@ import fs from 'fs'
import path from 'path'
import { addDependenciesToPackage, install } from '@pnpm/core'
import { hashObject } from '@pnpm/crypto.object-hasher'
import { readMsgpackFileSync, writeMsgpackFileSync } from '@pnpm/fs.msgpack-file'
import { getIndexFilePathInCafs, getFilePathByModeInCafs, type PackageFilesIndex } from '@pnpm/store.cafs'
import { getFilePathByModeInCafs, type PackageFilesIndex } from '@pnpm/store.cafs'
import { StoreIndex, storeIndexKey } from '@pnpm/store.index'
import { getIntegrity, REGISTRY_MOCK_PORT } from '@pnpm/registry-mock'
import { prepareEmpty } from '@pnpm/prepare'
import { ENGINE_NAME } from '@pnpm/constants'
@@ -12,6 +12,11 @@ import { testDefaults } from '../utils/index.js'
const ENGINE_DIR = `${process.platform}-${process.arch}-node-${process.version.split('.')[0]}`
const storeIndexes: StoreIndex[] = []
afterAll(() => {
for (const si of storeIndexes) si.close()
})
test.skip('caching side effects of native package', async () => {
prepareEmpty()
@@ -82,8 +87,10 @@ test('using side effects cache', async () => {
}, {}, {}, { packageImportMethod: 'copy' })
const { updatedManifest: manifest } = await addDependenciesToPackage({}, ['@pnpm.e2e/pre-and-postinstall-scripts-example@1.0.0'], opts)
const filesIndexFile = getIndexFilePathInCafs(opts.storeDir, getIntegrity('@pnpm.e2e/pre-and-postinstall-scripts-example', '1.0.0'), '@pnpm.e2e/pre-and-postinstall-scripts-example@1.0.0')
const filesIndex = readMsgpackFileSync<PackageFilesIndex>(filesIndexFile)
const filesIndexKey = storeIndexKey(getIntegrity('@pnpm.e2e/pre-and-postinstall-scripts-example', '1.0.0'), '@pnpm.e2e/pre-and-postinstall-scripts-example@1.0.0')
const storeIndex = new StoreIndex(opts.storeDir)
storeIndexes.push(storeIndex)
const filesIndex = storeIndex.get(filesIndexKey) as PackageFilesIndex
expect(filesIndex.sideEffects).toBeTruthy() // files index has side effects
const sideEffectsKey = `${ENGINE_NAME};deps=${hashObject({
id: `@pnpm.e2e/pre-and-postinstall-scripts-example@1.0.0:${getIntegrity('@pnpm.e2e/pre-and-postinstall-scripts-example', '1.0.0')}`,
@@ -101,7 +108,7 @@ test('using side effects cache', async () => {
expect(addedFiles.has('generated-by-preinstall.js')).toBeTruthy()
expect(addedFiles.has('generated-by-postinstall.js')).toBeTruthy()
addedFiles.delete('generated-by-postinstall.js')
writeMsgpackFileSync(filesIndexFile, filesIndex)
storeIndex.set(filesIndexKey, filesIndex)
rimraf('node_modules')
rimraf('pnpm-lock.yaml') // to avoid headless install
@@ -169,9 +176,11 @@ test('uploading errors do not interrupt installation', async () => {
expect(fs.existsSync('node_modules/@pnpm.e2e/pre-and-postinstall-scripts-example/generated-by-postinstall.js')).toBeTruthy()
const filesIndexFile = getIndexFilePathInCafs(opts.storeDir, getIntegrity('@pnpm.e2e/pre-and-postinstall-scripts-example', '1.0.0'), '@pnpm.e2e/pre-and-postinstall-scripts-example@1.0.0')
const filesIndex = readMsgpackFileSync<PackageFilesIndex>(filesIndexFile)
expect(filesIndex.sideEffects).toBeFalsy()
const filesIndexKey2 = storeIndexKey(getIntegrity('@pnpm.e2e/pre-and-postinstall-scripts-example', '1.0.0'), '@pnpm.e2e/pre-and-postinstall-scripts-example@1.0.0')
const storeIndex2 = new StoreIndex(opts.storeDir)
const filesIndex2 = storeIndex2.get(filesIndexKey2) as PackageFilesIndex
storeIndex2.close()
expect(filesIndex2.sideEffects).toBeFalsy()
})
test('a postinstall script does not modify the original sources added to the store', async () => {
@@ -187,8 +196,10 @@ test('a postinstall script does not modify the original sources added to the sto
expect(fs.readFileSync('node_modules/@pnpm/postinstall-modifies-source/empty-file.txt', 'utf8')).toContain('hello')
const filesIndexFile = getIndexFilePathInCafs(opts.storeDir, getIntegrity('@pnpm/postinstall-modifies-source', '1.0.0'), '@pnpm/postinstall-modifies-source@1.0.0')
const filesIndex = readMsgpackFileSync<PackageFilesIndex>(filesIndexFile)
const filesIndexKey3 = storeIndexKey(getIntegrity('@pnpm/postinstall-modifies-source', '1.0.0'), '@pnpm/postinstall-modifies-source@1.0.0')
const storeIndex3 = new StoreIndex(opts.storeDir)
const filesIndex = storeIndex3.get(filesIndexKey3) as PackageFilesIndex
storeIndex3.close()
expect(filesIndex.sideEffects).toBeTruthy()
expect(filesIndex.sideEffects?.has(`${ENGINE_NAME};deps=${hashObject({
id: `@pnpm/postinstall-modifies-source@1.0.0:${getIntegrity('@pnpm/postinstall-modifies-source', '1.0.0')}`,
@@ -219,9 +230,11 @@ test('a corrupted side-effects cache is ignored', async () => {
})
const { updatedManifest: manifest } = await addDependenciesToPackage({}, ['@pnpm.e2e/pre-and-postinstall-scripts-example@1.0.0'], opts)
const filesIndexFile = getIndexFilePathInCafs(opts.storeDir, getIntegrity('@pnpm.e2e/pre-and-postinstall-scripts-example', '1.0.0'), '@pnpm.e2e/pre-and-postinstall-scripts-example@1.0.0')
const filesIndex = readMsgpackFileSync<PackageFilesIndex>(filesIndexFile)
expect(filesIndex.sideEffects).toBeTruthy() // files index has side effects
const filesIndexKey4 = storeIndexKey(getIntegrity('@pnpm.e2e/pre-and-postinstall-scripts-example', '1.0.0'), '@pnpm.e2e/pre-and-postinstall-scripts-example@1.0.0')
const storeIndex4 = new StoreIndex(opts.storeDir)
const filesIndex4 = storeIndex4.get(filesIndexKey4) as PackageFilesIndex
storeIndex4.close()
expect(filesIndex4.sideEffects).toBeTruthy() // files index has side effects
const sideEffectsKey = `${ENGINE_NAME};deps=${hashObject({
id: `@pnpm.e2e/pre-and-postinstall-scripts-example@1.0.0:${getIntegrity('@pnpm.e2e/pre-and-postinstall-scripts-example', '1.0.0')}`,
deps: {
@@ -232,11 +245,11 @@ test('a corrupted side-effects cache is ignored', async () => {
},
})}`
expect(filesIndex.sideEffects).toBeTruthy()
expect(filesIndex.sideEffects!.has(sideEffectsKey)).toBeTruthy()
expect(filesIndex.sideEffects!.get(sideEffectsKey)!.added).toBeTruthy()
expect(filesIndex.sideEffects!.get(sideEffectsKey)!.added!.has('generated-by-preinstall.js')).toBeTruthy()
const sideEffectFileStat = filesIndex.sideEffects!.get(sideEffectsKey)!.added!.get('generated-by-preinstall.js')!
expect(filesIndex4.sideEffects).toBeTruthy()
expect(filesIndex4.sideEffects!.has(sideEffectsKey)).toBeTruthy()
expect(filesIndex4.sideEffects!.get(sideEffectsKey)!.added).toBeTruthy()
expect(filesIndex4.sideEffects!.get(sideEffectsKey)!.added!.has('generated-by-preinstall.js')).toBeTruthy()
const sideEffectFileStat = filesIndex4.sideEffects!.get(sideEffectsKey)!.added!.get('generated-by-preinstall.js')!
const sideEffectFile = getFilePathByModeInCafs(opts.storeDir, sideEffectFileStat.digest, sideEffectFileStat.mode)
expect(fs.existsSync(sideEffectFile)).toBeTruthy()
rimraf(sideEffectFile) // we remove the side effect file to break the store

View File

@@ -153,6 +153,9 @@
{
"path": "../../store/cafs"
},
{
"path": "../../store/index"
},
{
"path": "../../store/store-controller-types"
},

View File

@@ -81,7 +81,6 @@
"@jest/globals": "catalog:",
"@pnpm/assert-project": "workspace:*",
"@pnpm/crypto.object-hasher": "workspace:*",
"@pnpm/fs.msgpack-file": "workspace:*",
"@pnpm/headless": "workspace:*",
"@pnpm/logger": "workspace:*",
"@pnpm/prepare": "workspace:*",
@@ -89,6 +88,7 @@
"@pnpm/registry-mock": "catalog:",
"@pnpm/store-path": "workspace:*",
"@pnpm/store.cafs": "workspace:*",
"@pnpm/store.index": "workspace:*",
"@pnpm/test-fixtures": "workspace:*",
"@pnpm/test-ipc-server": "workspace:*",
"@pnpm/testing.temp-store": "workspace:*",

View File

@@ -664,8 +664,6 @@ export async function headlessInstall (opts: HeadlessOptions): Promise<Installat
summaryLogger.debug({ prefix: lockfileDir })
await opts.storeController.close()
if (!opts.ignoreScripts && !opts.ignorePackageManifest) {
await runLifecycleHooksConcurrently(
['preinstall', 'install', 'postinstall', 'preprepare', 'prepare', 'postprepare'],

View File

@@ -3,7 +3,7 @@ import fs from 'fs'
import path from 'path'
import { assertProject } from '@pnpm/assert-project'
import { hashObject } from '@pnpm/crypto.object-hasher'
import { getIndexFilePathInCafs, type PackageFilesIndex } from '@pnpm/store.cafs'
import { type PackageFilesIndex } from '@pnpm/store.cafs'
import { ENGINE_NAME, WANTED_LOCKFILE } from '@pnpm/constants'
import {
type PackageManifestLog,
@@ -11,7 +11,7 @@ import {
type StageLog,
type StatsLog,
} from '@pnpm/core-loggers'
import { readMsgpackFileSync, writeMsgpackFileSync } from '@pnpm/fs.msgpack-file'
import { StoreIndex, storeIndexKey } from '@pnpm/store.index'
import { headlessInstall } from '@pnpm/headless'
import { readWantedLockfile } from '@pnpm/lockfile.fs'
import { readModulesManifest } from '@pnpm/modules-yaml'
@@ -27,6 +27,11 @@ import { testDefaults } from './utils/testDefaults.js'
const f = fixtures(import.meta.dirname)
const storeIndexes: StoreIndex[] = []
afterAll(() => {
for (const si of storeIndexes) si.close()
})
test('installing a simple project', async () => {
const prefix = f.prepare('simple')
const reporter = jest.fn()
@@ -696,8 +701,10 @@ test.each([['isolated'], ['hoisted']])('using side effects cache with nodeLinker
}, {}, {}, { packageImportMethod: 'copy' })
await headlessInstall(opts)
const cacheIntegrityPath = getIndexFilePathInCafs(opts.storeDir, getIntegrity('@pnpm.e2e/pre-and-postinstall-scripts-example', '1.0.0'), '@pnpm.e2e/pre-and-postinstall-scripts-example@1.0.0')
const cacheIntegrity = readMsgpackFileSync<PackageFilesIndex>(cacheIntegrityPath)
const cacheIntegrityPath = storeIndexKey(getIntegrity('@pnpm.e2e/pre-and-postinstall-scripts-example', '1.0.0'), '@pnpm.e2e/pre-and-postinstall-scripts-example@1.0.0')
const storeIndex = new StoreIndex(opts.storeDir)
storeIndexes.push(storeIndex)
const cacheIntegrity = storeIndex.get(cacheIntegrityPath) as PackageFilesIndex
expect(cacheIntegrity!.sideEffects).toBeTruthy()
const sideEffectsKey = `${ENGINE_NAME};deps=${hashObject({
id: `@pnpm.e2e/pre-and-postinstall-scripts-example@1.0.0:${getIntegrity('@pnpm.e2e/pre-and-postinstall-scripts-example', '1.0.0')}`,
@@ -712,7 +719,7 @@ test.each([['isolated'], ['hoisted']])('using side effects cache with nodeLinker
cacheIntegrity!.sideEffects!.get(sideEffectsKey)!.added!.delete('generated-by-postinstall.js')
expect(cacheIntegrity!.sideEffects!.get(sideEffectsKey)?.added?.has('generated-by-preinstall.js')).toBeTruthy()
writeMsgpackFileSync(cacheIntegrityPath, cacheIntegrity)
storeIndex.set(cacheIntegrityPath, cacheIntegrity)
prefix = f.prepare('side-effects')
const opts2 = await testDefaults({

View File

@@ -39,9 +39,6 @@
{
"path": "../../exec/lifecycle"
},
{
"path": "../../fs/msgpack-file"
},
{
"path": "../../fs/symlink-dependency"
},
@@ -90,6 +87,9 @@
{
"path": "../../store/cafs"
},
{
"path": "../../store/index"
},
{
"path": "../../store/store-controller-types"
},

View File

@@ -45,6 +45,7 @@
"@pnpm/resolver-base": "workspace:*",
"@pnpm/store-controller-types": "workspace:*",
"@pnpm/store.cafs": "workspace:*",
"@pnpm/store.index": "workspace:*",
"@pnpm/types": "workspace:*",
"detect-libc": "catalog:",
"load-json-file": "catalog:",

View File

@@ -1,9 +1,9 @@
import { createReadStream, promises as fs } from 'fs'
import path from 'path'
import {
getIndexFilePathInCafs as _getIndexFilePathInCafs,
normalizeBundledManifest,
} from '@pnpm/store.cafs'
import { gitHostedStoreIndexKey, storeIndexKey } from '@pnpm/store.index'
import { fetchingProgressLogger, progressLogger } from '@pnpm/core-loggers'
import { pickFetcher } from '@pnpm/pick-fetcher'
import { PnpmError } from '@pnpm/error'
@@ -96,7 +96,6 @@ export function createPackageRequester (
concurrency: networkConcurrency,
})
const getIndexFilePathInCafs = _getIndexFilePathInCafs.bind(null, opts.storeDir)
const fetch = fetcher.bind(null, opts.fetchers, opts.cafs, opts.customFetchers)
const readPkgFromCafs = _readPkgFromCafs.bind(null, {
storeDir: opts.storeDir,
@@ -107,7 +106,6 @@ export function createPackageRequester (
readPkgFromCafs,
fetch,
fetchingLocker: new Map(),
getIndexFilePathInCafs,
requestsQueue: Object.assign(requestsQueue, {
counter: 0,
concurrency: networkConcurrency,
@@ -130,7 +128,6 @@ export function createPackageRequester (
return Object.assign(requestPackage, {
fetchPackageToStore,
getFilesIndexFilePath: getFilesIndexFilePath.bind(null, {
getIndexFilePathInCafs,
storeDir: opts.storeDir,
virtualStoreDirMaxLength: opts.virtualStoreDirMaxLength,
}),
@@ -333,7 +330,6 @@ interface GetFilesIndexFilePathResult {
function getFilesIndexFilePath (
ctx: {
getIndexFilePathInCafs: (integrity: string, pkgId: string) => string
storeDir: string
virtualStoreDirMaxLength: number
},
@@ -344,7 +340,7 @@ function getFilesIndexFilePath (
if ((opts.pkg.resolution as TarballResolution).integrity) {
return {
target,
filesIndexFile: ctx.getIndexFilePathInCafs((opts.pkg.resolution as TarballResolution).integrity!, opts.pkg.id),
filesIndexFile: storeIndexKey((opts.pkg.resolution as TarballResolution).integrity!, opts.pkg.id),
resolution: opts.pkg.resolution as AtomicResolution,
}
}
@@ -354,14 +350,14 @@ function getFilesIndexFilePath (
if ((resolution as TarballResolution).integrity) {
return {
target,
filesIndexFile: ctx.getIndexFilePathInCafs((resolution as TarballResolution).integrity!, opts.pkg.id),
filesIndexFile: storeIndexKey((resolution as TarballResolution).integrity!, opts.pkg.id),
resolution,
}
}
} else {
resolution = opts.pkg.resolution
}
const filesIndexFile = path.join(target, opts.ignoreScripts ? 'integrity-not-built.mpk' : 'integrity.mpk')
const filesIndexFile = gitHostedStoreIndexKey(opts.pkg.id, { built: !opts.ignoreScripts })
return { filesIndexFile, target, resolution }
}
@@ -402,7 +398,6 @@ function fetchToStore (
opts: FetchOptions
) => Promise<FetchResult>
fetchingLocker: Map<string, FetchLock>
getIndexFilePathInCafs: (integrity: string, pkgId: string) => string
requestsQueue: {
add: <T>(fn: () => Promise<T>, opts: { priority: number }) => Promise<T>
counter: number

View File

@@ -3,7 +3,7 @@ import fs from 'fs'
import path from 'path'
import { type PackageFilesIndex } from '@pnpm/store.cafs'
import { createClient } from '@pnpm/client'
import { readMsgpackFileSync } from '@pnpm/fs.msgpack-file'
import { StoreIndex } from '@pnpm/store.index'
import { streamParser } from '@pnpm/logger'
import { createPackageRequester, type PackageResponse } from '@pnpm/package-requester'
import { createCafsStore } from '@pnpm/create-cafs-store'
@@ -26,14 +26,36 @@ const registries = { default: registry }
const authConfig = { registry }
const storeIndexes: StoreIndex[] = []
afterAll(() => {
for (const si of storeIndexes) si.close()
})
const topStoreIndex = new StoreIndex('.store')
storeIndexes.push(topStoreIndex)
const { resolve, fetchers } = createClient({
authConfig,
cacheDir: '.store',
storeDir: '.store',
rawConfig: {},
registries,
storeIndex: topStoreIndex,
})
function createFetchersForStore (storeDir: string) {
const si = new StoreIndex(storeDir)
storeIndexes.push(si)
return createClient({
authConfig,
rawConfig: {},
cacheDir: storeDir,
storeDir,
registries,
storeIndex: si,
}).fetchers
}
afterEach(() => {
nock.abortPendingRequests()
nock.cleanAll()
@@ -150,7 +172,6 @@ test('request package but skip fetching, when resolution is already available',
update: false,
}) as PackageResponse & {
body: {
latest: string
manifest: { name: string }
}
}
@@ -160,7 +181,7 @@ test('request package but skip fetching, when resolution is already available',
expect(pkgResponse.body.id).toBe('is-positive@1.0.0')
expect(pkgResponse.body.isLocal).toBe(false)
expect(typeof pkgResponse.body.latest).toBe('string')
// latest may be undefined when the resolver's fast path resolves from the store cache
expect(pkgResponse.body.manifest.name).toBe('is-positive')
expect(!pkgResponse.body.normalizedBareSpecifier).toBeTruthy()
expect(pkgResponse.body.resolution).toStrictEqual({
@@ -180,6 +201,7 @@ test('refetch local tarball if its integrity has changed', async () => {
const wantedPackage = { bareSpecifier: tarball }
const storeDir = temporaryDirectory()
const cafs = createCafsStore(storeDir)
const localFetchers = createFetchersForStore(storeDir)
const pkgId = `file:${normalize(tarballRelativePath)}`
const requestPackageOpts = {
downloadPriority: 0,
@@ -193,7 +215,7 @@ test('refetch local tarball if its integrity has changed', async () => {
{
const requestPackage = createPackageRequester({
resolve,
fetchers,
fetchers: localFetchers,
cafs,
storeDir,
verifyStoreIntegrity: true,
@@ -225,7 +247,7 @@ test('refetch local tarball if its integrity has changed', async () => {
{
const requestPackage = createPackageRequester({
resolve,
fetchers,
fetchers: localFetchers,
cafs,
storeDir,
verifyStoreIntegrity: true,
@@ -252,7 +274,7 @@ test('refetch local tarball if its integrity has changed', async () => {
{
const requestPackage = createPackageRequester({
resolve,
fetchers,
fetchers: localFetchers,
cafs,
storeDir,
verifyStoreIntegrity: true,
@@ -287,6 +309,7 @@ test('refetch local tarball if its integrity has changed. The requester does not
const wantedPackage = { bareSpecifier: tarball }
const storeDir = path.join(projectDir, 'store')
const cafs = createCafsStore(storeDir)
const localFetchers = createFetchersForStore(storeDir)
const requestPackageOpts = {
downloadPriority: 0,
lockfileDir: projectDir,
@@ -298,7 +321,7 @@ test('refetch local tarball if its integrity has changed. The requester does not
{
const requestPackage = createPackageRequester({
resolve,
fetchers,
fetchers: localFetchers,
cafs,
storeDir,
verifyStoreIntegrity: true,
@@ -321,7 +344,7 @@ test('refetch local tarball if its integrity has changed. The requester does not
{
const requestPackage = createPackageRequester({
resolve,
fetchers,
fetchers: localFetchers,
cafs,
storeDir,
verifyStoreIntegrity: true,
@@ -341,7 +364,7 @@ test('refetch local tarball if its integrity has changed. The requester does not
{
const requestPackage = createPackageRequester({
resolve,
fetchers,
fetchers: localFetchers,
cafs,
storeDir,
verifyStoreIntegrity: true,
@@ -421,9 +444,10 @@ test('force fetch when resolution integrity differs from current package integri
test('fetchPackageToStore()', async () => {
const storeDir = temporaryDirectory()
const cafs = createCafsStore(storeDir)
const localFetchers = createFetchersForStore(storeDir)
const packageRequester = createPackageRequester({
resolve,
fetchers,
fetchers: localFetchers,
cafs,
networkConcurrency: 1,
storeDir,
@@ -451,7 +475,9 @@ test('fetchPackageToStore()', async () => {
expect(Array.from(files.filesMap.keys()).sort((a, b) => a.localeCompare(b))).toStrictEqual(['package.json', 'index.js', 'license', 'readme.md'].sort((a, b) => a.localeCompare(b)))
expect(files.resolvedFrom).toBe('remote')
const indexFile = readMsgpackFileSync<PackageFilesIndex>(fetchResult.filesIndexFile)
const storeIndex = new StoreIndex(storeDir)
storeIndexes.push(storeIndex)
const indexFile = storeIndex.get(fetchResult.filesIndexFile) as PackageFilesIndex
expect(indexFile).toBeTruthy()
expect(typeof indexFile.files.get('package.json')!.checkedAt).toBeTruthy()
@@ -571,6 +597,7 @@ test('fetchPackageToStore() does not cache errors', async () => {
cacheDir: '.pnpm',
storeDir: '.store',
registries,
storeIndex: topStoreIndex,
})
const storeDir = temporaryDirectory()
@@ -732,6 +759,7 @@ test('fetchPackageToStore() fetch raw manifest of cached package', async () => {
test('refetch package to store if it has been modified', async () => {
const storeDir = temporaryDirectory()
const lockfileDir = temporaryDirectory()
const localFetchers = createFetchersForStore(storeDir)
const pkgId = 'magic-hook@2.0.0'
const resolution = {
@@ -743,7 +771,7 @@ test('refetch package to store if it has been modified', async () => {
const cafs = createCafsStore(storeDir)
const packageRequester = createPackageRequester({
resolve,
fetchers,
fetchers: localFetchers,
cafs,
networkConcurrency: 1,
storeDir,
@@ -782,7 +810,7 @@ test('refetch package to store if it has been modified', async () => {
const cafs = createCafsStore(storeDir)
const packageRequester = createPackageRequester({
resolve,
fetchers,
fetchers: localFetchers,
cafs,
networkConcurrency: 1,
storeDir,
@@ -893,12 +921,13 @@ test('fetch a git package without a package.json', async () => {
test('throw exception if the package data in the store differs from the expected data', async () => {
const storeDir = temporaryDirectory()
const cafs = createCafsStore(storeDir)
const localFetchers = createFetchersForStore(storeDir)
let pkgResponse!: PackageResponse
{
const requestPackage = createPackageRequester({
resolve,
fetchers,
fetchers: localFetchers,
cafs,
networkConcurrency: 1,
storeDir,
@@ -920,7 +949,7 @@ test('throw exception if the package data in the store differs from the expected
{
const requestPackage = createPackageRequester({
resolve,
fetchers,
fetchers: localFetchers,
cafs,
networkConcurrency: 1,
storeDir,
@@ -944,7 +973,7 @@ test('throw exception if the package data in the store differs from the expected
{
const requestPackage = createPackageRequester({
resolve,
fetchers,
fetchers: localFetchers,
cafs,
networkConcurrency: 1,
storeDir,
@@ -968,7 +997,7 @@ test('throw exception if the package data in the store differs from the expected
{
const requestPackage = createPackageRequester({
resolve,
fetchers,
fetchers: localFetchers,
cafs,
networkConcurrency: 1,
storeDir,

View File

@@ -57,6 +57,9 @@
{
"path": "../../store/create-cafs-store"
},
{
"path": "../../store/index"
},
{
"path": "../../store/store-controller-types"
},

View File

@@ -101,8 +101,10 @@
"@pnpm/plugin-commands-installation": "workspace:*",
"@pnpm/prepare": "workspace:*",
"@pnpm/registry-mock": "catalog:",
"@pnpm/store.index": "workspace:*",
"@pnpm/test-fixtures": "workspace:*",
"@pnpm/test-ipc-server": "workspace:*",
"@pnpm/worker": "workspace:*",
"@pnpm/workspace.filter-packages-from-dir": "workspace:*",
"@types/normalize-path": "catalog:",
"@types/proxyquire": "catalog:",

View File

@@ -4,6 +4,8 @@ import { STORE_VERSION } from '@pnpm/constants'
import { install, fetch } from '@pnpm/plugin-commands-installation'
import { prepare } from '@pnpm/prepare'
import { REGISTRY_MOCK_PORT } from '@pnpm/registry-mock'
import { closeAllStoreIndexes } from '@pnpm/store.index'
import { finishWorkers } from '@pnpm/worker'
import { sync as rimraf } from '@zkochan/rimraf'
const REGISTRY_URL = `http://localhost:${REGISTRY_MOCK_PORT}`
@@ -204,6 +206,10 @@ test('fetch populates global virtual store links/', async () => {
storeDir,
})
// Drain workers and close SQLite connections before removing the store (required on Windows)
await finishWorkers()
closeAllStoreIndexes()
// Remove the store — simulate a cold start with only the lockfile
rimraf(storeDir)

View File

@@ -105,12 +105,18 @@
{
"path": "../../reviewing/outdated"
},
{
"path": "../../store/index"
},
{
"path": "../../store/package-store"
},
{
"path": "../../store/store-connection-manager"
},
{
"path": "../../worker"
},
{
"path": "../../workspace/filter-packages-from-dir"
},

pnpm-lock.yaml (generated)
View File

File diff suppressed because it is too large

View File

@@ -2,7 +2,12 @@ import { build } from 'esbuild'
;(async () => {
try {
const banner = { js: `import { createRequire as _cr } from 'module';const require = _cr(import.meta.url); const __filename = import.meta.filename; const __dirname = import.meta.dirname` }
const banner = { js: [
`import { createRequire as _cr } from 'module';const require = _cr(import.meta.url); const __filename = import.meta.filename; const __dirname = import.meta.dirname;`,
// Suppress "SQLite is an experimental feature" warnings.
// Must run before any module that loads node:sqlite.
`var _ew=process.emitWarning;process.emitWarning=function(w,...a){if(String(w).includes('SQLite')&&(a[0]==='ExperimentalWarning'||(a[0]&&a[0].type==='ExperimentalWarning')))return;return _ew.call(process,w,...a)};`,
].join('') }
await build({
entryPoints: ['lib/pnpm.js'],
bundle: true,
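
For reference, the minified `process.emitWarning` override in the banner above unpacks to roughly the following (same logic, expanded for readability; the variable names are illustrative):

```ts
// Unminified equivalent of the banner's emitWarning override (illustrative sketch).
// Swallows only the node:sqlite ExperimentalWarning; everything else passes through.
const originalEmitWarning = process.emitWarning.bind(process)
process.emitWarning = ((warning: string | Error, ...args: any[]) => {
  const isSqliteWarning = String(warning).includes('SQLite')
  const isExperimental =
    args[0] === 'ExperimentalWarning' ||
    (args[0] != null && args[0].type === 'ExperimentalWarning')
  if (isSqliteWarning && isExperimental) return // suppress only this warning
  return originalEmitWarning(warning, ...args)
}) as typeof process.emitWarning
```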

View File

@@ -127,6 +127,7 @@
"@pnpm/read-project-manifest": "workspace:*",
"@pnpm/registry-mock": "catalog:",
"@pnpm/run-npm": "workspace:*",
"@pnpm/store.index": "workspace:*",
"@pnpm/store.cafs": "workspace:*",
"@pnpm/tabtab": "catalog:",
"@pnpm/test-fixtures": "workspace:*",

View File

@@ -1,13 +1,13 @@
import fs from 'fs'
import path from 'path'
import { STORE_VERSION, WANTED_LOCKFILE } from '@pnpm/constants'
import { readMsgpackFileSync, writeMsgpackFileSync } from '@pnpm/fs.msgpack-file'
import { type LockfileObject } from '@pnpm/lockfile.types'
import { prepare, prepareEmpty, preparePackages } from '@pnpm/prepare'
import { readPackageJsonFromDir } from '@pnpm/read-package-json'
import { readProjectManifest } from '@pnpm/read-project-manifest'
import { getIntegrity } from '@pnpm/registry-mock'
import { getIndexFilePathInCafs, type PackageFilesIndex } from '@pnpm/store.cafs'
import { type PackageFilesIndex } from '@pnpm/store.cafs'
import { StoreIndex, storeIndexKey } from '@pnpm/store.index'
import { lexCompare } from '@pnpm/util.lex-comparator'
import { writeProjectManifest } from '@pnpm/write-project-manifest'
import { fixtures } from '@pnpm/test-fixtures'
@@ -25,6 +25,11 @@ import {
const skipOnWindows = isWindows() ? test.skip : test
const f = fixtures(import.meta.dirname)
const storeIndexes: StoreIndex[] = []
afterAll(() => {
for (const si of storeIndexes) si.close()
})
test('bin files are found by lifecycle scripts', () => {
prepare({
dependencies: {
@@ -159,12 +164,19 @@ test("don't fail on case insensitive filesystems when package has 2 files with s
project.has('@pnpm.e2e/with-same-file-in-different-cases')
const { files: integrityFile } = readMsgpackFileSync<PackageFilesIndex>(project.getPkgIndexFilePath('@pnpm.e2e/with-same-file-in-different-cases', '1.0.0'))
const packageFiles = Array.from(integrityFile.keys()).sort(lexCompare)
const storeDir = project.getStorePath()
const indexKey = storeIndexKey(getIntegrity('@pnpm.e2e/with-same-file-in-different-cases', '1.0.0'), '@pnpm.e2e/with-same-file-in-different-cases@1.0.0')
const si = new StoreIndex(storeDir)
let filesIndex: PackageFilesIndex
try {
filesIndex = si.get(indexKey) as PackageFilesIndex
} finally {
si.close()
}
const packageFiles = Array.from(filesIndex.files.keys()).sort(lexCompare)
expect(packageFiles).toStrictEqual(['Foo.js', 'foo.js', 'package.json'])
const files = fs.readdirSync('node_modules/@pnpm.e2e/with-same-file-in-different-cases')
const storeDir = project.getStorePath()
if (await dirIsCaseSensitive.default(storeDir)) {
expect([...files].sort(lexCompare)).toStrictEqual(['Foo.js', 'foo.js', 'package.json'])
} else {
@@ -505,9 +517,11 @@ test('installation fails when the stored package name and version do not match t
await execPnpm(['add', '@pnpm.e2e/dep-of-pkg-with-1-dep@100.1.0', ...settings])
const cacheIntegrityPath = getIndexFilePathInCafs(path.join(storeDir, STORE_VERSION), getIntegrity('@pnpm.e2e/dep-of-pkg-with-1-dep', '100.1.0'), '@pnpm.e2e/dep-of-pkg-with-1-dep@100.1.0')
const cacheIntegrity = readMsgpackFileSync<PackageFilesIndex>(cacheIntegrityPath)
writeMsgpackFileSync(cacheIntegrityPath, {
const cacheIntegrityKey = storeIndexKey(getIntegrity('@pnpm.e2e/dep-of-pkg-with-1-dep', '100.1.0'), '@pnpm.e2e/dep-of-pkg-with-1-dep@100.1.0')
const storeIndex = new StoreIndex(path.join(storeDir, STORE_VERSION))
storeIndexes.push(storeIndex)
const cacheIntegrity = storeIndex.get(cacheIntegrityKey) as PackageFilesIndex
storeIndex.set(cacheIntegrityKey, {
...cacheIntegrity,
manifest: { ...cacheIntegrity.manifest, name: 'foo' },
})

View File

@@ -158,6 +158,9 @@
{
"path": "../store/cafs"
},
{
"path": "../store/index"
},
{
"path": "../store/plugin-commands-store"
},

View File

@@ -47,6 +47,7 @@
"@pnpm/resolver-base": "workspace:*",
"@pnpm/resolving.jsr-specifier-parser": "workspace:*",
"@pnpm/store.cafs": "workspace:*",
"@pnpm/store.index": "workspace:*",
"@pnpm/types": "workspace:*",
"@pnpm/workspace.spec-parser": "workspace:*",
"@zkochan/retry": "catalog:",

View File

@@ -19,7 +19,7 @@ import {
type WorkspacePackages,
type WorkspacePackagesByVersion,
} from '@pnpm/resolver-base'
import { getIndexFilePathInCafs } from '@pnpm/store.cafs'
import { storeIndexKey } from '@pnpm/store.index'
import {
readPkgFromCafs,
} from '@pnpm/worker'
@@ -184,7 +184,7 @@ export function createNpmResolver (
let peekManifestFromStore: ResolveFromNpmContext['peekManifestFromStore'] | undefined
if (storeDir) {
peekManifestFromStore = async (peekOpts) => {
const filesIndexFile = getIndexFilePathInCafs(storeDir, peekOpts.integrity, peekOpts.id)
const filesIndexFile = storeIndexKey(peekOpts.integrity, peekOpts.id)
const existingRequest = peekLockerForPeek.get(filesIndexFile)
if (existingRequest != null) {
return existingRequest

View File

@@ -57,6 +57,9 @@
{
"path": "../../store/cafs"
},
{
"path": "../../store/index"
},
{
"path": "../../worker"
},

View File

@@ -35,7 +35,6 @@
},
"dependencies": {
"@pnpm/dependency-path": "workspace:*",
"@pnpm/fs.msgpack-file": "workspace:*",
"@pnpm/lockfile.detect-dep-types": "workspace:*",
"@pnpm/lockfile.fs": "workspace:*",
"@pnpm/lockfile.utils": "workspace:*",
@@ -46,6 +45,7 @@
"@pnpm/read-modules-dir": "workspace:*",
"@pnpm/read-package-json": "workspace:*",
"@pnpm/store.cafs": "workspace:*",
"@pnpm/store.index": "workspace:*",
"@pnpm/types": "workspace:*",
"@pnpm/util.lex-comparator": "catalog:",
"load-json-file": "catalog:",

View File

@@ -9,6 +9,7 @@ import {
} from '@pnpm/lockfile.fs'
import { detectDepTypes } from '@pnpm/lockfile.detect-dep-types'
import { readModulesManifest } from '@pnpm/modules-yaml'
import { StoreIndex } from '@pnpm/store.index'
import { normalizeRegistries } from '@pnpm/normalize-registries'
import { readModulesDir } from '@pnpm/read-modules-dir'
import { safeReadPackageJsonFromDir } from '@pnpm/read-package-json'
@@ -72,6 +73,8 @@ export async function buildDependenciesTree (
return result
}
const storeDir = modules?.storeDir
const storeIndex = storeDir ? new StoreIndex(storeDir) : undefined
const opts = {
depth: maybeOpts.depth || 0,
excludePeerDependencies: maybeOpts.excludePeerDependencies,
@@ -87,7 +90,8 @@ export async function buildDependenciesTree (
search: maybeOpts.search,
showDedupedSearchMatches: maybeOpts.showDedupedSearchMatches ?? (maybeOpts.search != null),
skipped: new Set(modules?.skipped ?? []),
storeDir: modules?.storeDir,
storeDir,
storeIndex,
modulesDir,
virtualStoreDir: modules?.virtualStoreDir,
virtualStoreDirMaxLength: modules?.virtualStoreDirMaxLength ?? maybeOpts.virtualStoreDirMaxLength,
@@ -130,6 +134,7 @@ export async function buildDependenciesTree (
for (const [projectPath, dependenciesHierarchy] of pairs) {
result[projectPath] = dependenciesHierarchy
}
storeIndex?.close()
return result
}

View File

@@ -7,6 +7,7 @@ import {
} from '@pnpm/lockfile.fs'
import { nameVerFromPkgSnapshot } from '@pnpm/lockfile.utils'
import { readModulesManifest } from '@pnpm/modules-yaml'
import { StoreIndex } from '@pnpm/store.index'
import { normalizeRegistries } from '@pnpm/normalize-registries'
import { type DependenciesField, type DependencyManifest, type Finder, type Registries } from '@pnpm/types'
import { lexCompare } from '@pnpm/util.lex-comparator'
@@ -90,6 +91,7 @@ export async function buildDependentsTree (
...modules?.registries,
})
const storeDir = modules?.storeDir
const storeIndex = storeDir ? new StoreIndex(storeDir) : undefined
const virtualStoreDir = modules?.virtualStoreDir ?? path.join(modulesDir, '.pnpm')
const virtualStoreDirMaxLength = modules?.virtualStoreDirMaxLength ?? 120
@@ -129,6 +131,7 @@ export async function buildDependentsTree (
registries,
wantedPackages: currentPackages,
storeDir,
storeIndex,
})
// Scan all package nodes for matches.
@@ -208,6 +211,7 @@ export async function buildDependentsTree (
if (versionCmp !== 0) return versionCmp
return lexCompare(a.peersSuffixHash ?? '', b.peersSuffixHash ?? '')
})
storeIndex?.close()
return trees
}
@@ -248,6 +252,7 @@ function resolvePackageNodes (
registries: Registries
wantedPackages: PackageSnapshots
storeDir?: string
storeIndex?: StoreIndex
}
): Map<string, { path: string, readManifest: () => DependencyManifest }> {
const resolved = new Map<string, { path: string, readManifest: () => DependencyManifest }>()

View File

@@ -9,6 +9,7 @@ import {
pkgSnapshotToResolution,
} from '@pnpm/lockfile.utils'
import { type DepTypes, DepType } from '@pnpm/lockfile.detect-dep-types'
import { type StoreIndex } from '@pnpm/store.index'
import { type DependencyManifest, type Registries } from '@pnpm/types'
import { refToRelative } from '@pnpm/dependency-path'
import { readPackageJsonFromDirSync } from '@pnpm/read-package-json'
@@ -24,6 +25,7 @@ export interface GetPkgInfoOpts {
readonly registries: Registries
readonly skipped: Set<string>
readonly storeDir?: string
readonly storeIndex?: StoreIndex
readonly wantedPackages: PackageSnapshots
readonly virtualStoreDir?: string
readonly virtualStoreDirMaxLength: number
@@ -141,8 +143,8 @@ export function getPkgInfo (opts: GetPkgInfoOpts): { pkgInfo: PackageInfo, readM
return {
pkgInfo: packageInfo,
readManifest: () => {
if (integrity && opts.storeDir) {
const manifest = readManifestFromCafs(opts.storeDir, { integrity, name, version })
if (integrity && opts.storeDir && opts.storeIndex) {
const manifest = readManifestFromCafs(opts.storeDir, opts.storeIndex, { integrity, name, version })
if (manifest) return manifest
}
return readPackageJsonFromDirSync(fullPackagePath)

View File

@@ -1,6 +1,7 @@
import path from 'path'
import { type PackageSnapshots, type ProjectSnapshot } from '@pnpm/lockfile.fs'
import { type DepTypes } from '@pnpm/lockfile.detect-dep-types'
import { type StoreIndex } from '@pnpm/store.index'
import { type Finder, type Registries } from '@pnpm/types'
import { lexCompare } from '@pnpm/util.lex-comparator'
import { type DependencyGraph } from './buildDependencyGraph.js'
@@ -23,6 +24,7 @@ export interface BaseTreeOpts {
registries: Registries
depTypes: DepTypes
storeDir?: string
storeIndex?: StoreIndex
virtualStoreDir?: string
virtualStoreDirMaxLength: number
modulesDir?: string

View File

@@ -1,21 +1,22 @@
import { readMsgpackFileSync } from '@pnpm/fs.msgpack-file'
import { type StoreIndex, storeIndexKey } from '@pnpm/store.index'
import { loadJsonFileSync } from 'load-json-file'
import { getIndexFilePathInCafs, getFilePathByModeInCafs, type PackageFilesIndex } from '@pnpm/store.cafs'
import { getFilePathByModeInCafs, type PackageFilesIndex } from '@pnpm/store.cafs'
import { type DependencyManifest } from '@pnpm/types'
/**
* Attempts to read a package manifest from the content-addressable store (CAFS)
* using its integrity hash. Returns `undefined` if the manifest cannot be read.
*/
export function readManifestFromCafs (storeDir: string, pkg: {
export function readManifestFromCafs (storeDir: string, storeIndex: StoreIndex, pkg: {
integrity: string
name: string
version: string
}): DependencyManifest | undefined {
try {
const pkgId = `${pkg.name}@${pkg.version}`
const indexPath = getIndexFilePathInCafs(storeDir, pkg.integrity, pkgId)
const pkgIndex = readMsgpackFileSync<PackageFilesIndex>(indexPath)
const indexPath = storeIndexKey(pkg.integrity, pkgId)
const pkgIndex = storeIndex.get(indexPath) as PackageFilesIndex | undefined
if (!pkgIndex) return undefined
const pkgJsonEntry = pkgIndex.files.get('package.json')
if (pkgJsonEntry) {
const filePath = getFilePathByModeInCafs(storeDir, pkgJsonEntry.digest, pkgJsonEntry.mode)

View File

@@ -18,9 +18,6 @@
{
"path": "../../config/normalize-registries"
},
{
"path": "../../fs/msgpack-file"
},
{
"path": "../../fs/read-modules-dir"
},
@@ -50,6 +47,9 @@
},
{
"path": "../../store/cafs"
},
{
"path": "../../store/index"
}
]
}

View File

@@ -41,6 +41,7 @@
"@pnpm/lockfile.walker": "workspace:*",
"@pnpm/package-is-installable": "workspace:*",
"@pnpm/read-package-json": "workspace:*",
"@pnpm/store.index": "workspace:*",
"@pnpm/store.pkg-finder": "workspace:*",
"@pnpm/types": "workspace:*",
"p-limit": "catalog:",

View File

@@ -5,6 +5,7 @@ import { readPackageJson } from '@pnpm/read-package-json'
import { depPathToFilename } from '@pnpm/dependency-path'
import pLimit from 'p-limit'
import { type PackageManifest, type Registries } from '@pnpm/types'
import { type StoreIndex } from '@pnpm/store.index'
import { readPackageFileMap } from '@pnpm/store.pkg-finder'
import { PnpmError } from '@pnpm/error'
import type { LicensePackage } from './licenses.js'
@@ -195,6 +196,7 @@ export interface PackageInfo {
export interface GetPackageInfoOptions {
storeDir: string
storeIndex: StoreIndex
virtualStoreDir: string
virtualStoreDirMaxLength: number
dir: string
@@ -229,6 +231,7 @@ export async function getPkgInfo (
pkg.id,
{
storeDir: opts.storeDir,
storeIndex: opts.storeIndex,
lockfileDir: opts.dir,
virtualStoreDirMaxLength: opts.virtualStoreDirMaxLength,
}

View File

@@ -1,11 +1,12 @@
import { type LockfileObject, type TarballResolution } from '@pnpm/lockfile.types'
import { nameVerFromPkgSnapshot } from '@pnpm/lockfile.utils'
import { nameVerFromPkgSnapshot, packageIdFromSnapshot } from '@pnpm/lockfile.utils'
import { packageIsInstallable } from '@pnpm/package-is-installable'
import {
lockfileWalkerGroupImporterSteps,
type LockfileWalkerStep,
} from '@pnpm/lockfile.walker'
import { type DepTypes, DepType, detectDepTypes } from '@pnpm/lockfile.detect-dep-types'
import { StoreIndex } from '@pnpm/store.index'
import { type SupportedArchitectures, type DependenciesField, type ProjectId, type Registries } from '@pnpm/types'
import { map as mapValues } from 'ramda'
import { getPkgInfo } from './getPkgInfo.js'
@@ -33,6 +34,7 @@ export type LicenseNodeTree = Omit<
export interface LicenseExtractOptions {
storeDir: string
storeIndex: StoreIndex
virtualStoreDir: string
virtualStoreDirMaxLength: number
modulesDir?: string
@@ -71,7 +73,7 @@ export async function lockfileToLicenseNode (
const packageInfo = await getPkgInfo(
{
id: pkgSnapshot.id ?? depPath,
id: packageIdFromSnapshot(depPath, pkgSnapshot),
name,
version,
depPath,
@@ -80,6 +82,7 @@ export async function lockfileToLicenseNode (
},
{
storeDir: options.storeDir,
storeIndex: options.storeIndex,
virtualStoreDir: options.virtualStoreDir,
virtualStoreDirMaxLength: options.virtualStoreDirMaxLength,
dir: options.dir,
@@ -128,7 +131,7 @@ export async function lockfileToLicenseNodeTree (
opts: {
include?: { [dependenciesField in DependenciesField]: boolean }
includedImporterIds?: ProjectId[]
} & LicenseExtractOptions
} & Omit<LicenseExtractOptions, 'storeIndex'>
): Promise<LicenseNodeTree> {
const importerWalkers = lockfileWalkerGroupImporterSteps(
lockfile,
@@ -136,11 +139,13 @@ export async function lockfileToLicenseNodeTree (
{ include: opts?.include }
)
const depTypes = detectDepTypes(lockfile)
const storeIndex = new StoreIndex(opts.storeDir)
const dependencies = Object.fromEntries(
await Promise.all(
importerWalkers.map(async (importerWalker) => {
const importerDeps = await lockfileToLicenseNode(importerWalker.step, {
storeDir: opts.storeDir,
storeIndex,
virtualStoreDir: opts.virtualStoreDir,
virtualStoreDirMaxLength: opts.virtualStoreDirMaxLength,
modulesDir: opts.modulesDir,
@@ -158,6 +163,7 @@ export async function lockfileToLicenseNodeTree (
})
)
)
storeIndex.close()
const licenseNodeTree: LicenseNodeTree = {
name: undefined,

View File

@@ -1,3 +1,7 @@
import fs from 'fs'
import os from 'os'
import path from 'path'
import { StoreIndex } from '@pnpm/store.index'
import { getPkgInfo } from '../lib/getPkgInfo.js'
export const DEFAULT_REGISTRIES = {
@@ -6,6 +10,19 @@ export const DEFAULT_REGISTRIES = {
}
describe('licences', () => {
let storeDir: string
let storeIndex: StoreIndex
beforeAll(() => {
storeDir = fs.mkdtempSync(path.join(os.tmpdir(), 'pnpm-license-test-'))
storeIndex = new StoreIndex(storeDir)
})
afterAll(() => {
storeIndex.close()
fs.rmSync(storeDir, { recursive: true, force: true })
})
test('getPkgInfo() should throw error when package info can not be fetched', async () => {
await expect(
getPkgInfo(
@@ -22,13 +39,14 @@ describe('licences', () => {
registries: DEFAULT_REGISTRIES,
},
{
storeDir: 'store-dir',
storeDir,
storeIndex,
virtualStoreDir: 'virtual-store-dir',
modulesDir: 'modules-dir',
dir: 'workspace-dir',
virtualStoreDirMaxLength: 120,
}
)
).rejects.toThrow(/Failed to find package index file for bogus-package@1\.0\.0 \(at .*16-bogus-package@1\.0\.0\.mpk\), please consider running 'pnpm install'/)
).rejects.toThrow(/Failed to find package index file for bogus-package@1\.0\.0 \(at .*\), please consider running 'pnpm install'/)
})
})

View File

@@ -1,3 +1,6 @@
import fs from 'fs'
import os from 'os'
import path from 'path'
import { LOCKFILE_VERSION } from '@pnpm/constants'
import type { DepPath, ProjectManifest, Registries, ProjectId } from '@pnpm/types'
import type { LockfileObject } from '@pnpm/lockfile.fs'
@@ -5,6 +8,11 @@ import { jest } from '@jest/globals'
import type { LicensePackage } from '../lib/licenses.js'
import type { GetPackageInfoOptions, PackageInfo } from '../lib/getPkgInfo.js'
const tmpStoreDir = fs.mkdtempSync(path.join(os.tmpdir(), 'pnpm-license-spec-'))
afterAll(() => {
fs.rmSync(tmpStoreDir, { recursive: true, force: true })
})
const actualModule = await import('../lib/getPkgInfo.js')
jest.unstable_mockModule('../lib/getPkgInfo.js', () => {
return {
@@ -72,7 +80,7 @@ describe('licences', () => {
virtualStoreDir: '/.pnpm',
registries: {} as Registries,
wantedLockfile: lockfile,
storeDir: '/opt/.pnpm',
storeDir: tmpStoreDir,
virtualStoreDirMaxLength: 120,
})
@@ -158,7 +166,7 @@ describe('licences', () => {
virtualStoreDir: '/.pnpm',
registries: {} as Registries,
wantedLockfile: lockfile,
storeDir: '/opt/.pnpm',
storeDir: tmpStoreDir,
includedImporterIds: ['packages/a'] as ProjectId[],
virtualStoreDirMaxLength: 120,
})
@@ -235,7 +243,7 @@ describe('licences', () => {
virtualStoreDir: '/.pnpm',
registries: {} as Registries,
wantedLockfile: lockfile,
storeDir: '/opt/.pnpm',
storeDir: tmpStoreDir,
virtualStoreDirMaxLength: 120,
})

View File

@@ -45,6 +45,9 @@
{
"path": "../../pkg-manifest/read-package-json"
},
{
"path": "../../store/index"
},
{
"path": "../../store/pkg-finder"
}

View File

@@ -14,7 +14,7 @@ interface GetManifestOpts {
minimumReleaseAgeExclude?: string[]
}
export type ManifestGetterOptions = Omit<ClientOptions, 'authConfig' | 'minimumReleaseAgeExclude'>
export type ManifestGetterOptions = Omit<ClientOptions, 'authConfig' | 'minimumReleaseAgeExclude' | 'storeIndex'>
& GetManifestOpts
& { fullMetadata: boolean, rawConfig: Record<string, string> }

View File

@@ -40,6 +40,7 @@
"@pnpm/lockfile.utils": "workspace:*",
"@pnpm/lockfile.walker": "workspace:*",
"@pnpm/read-package-json": "workspace:*",
"@pnpm/store.index": "workspace:*",
"@pnpm/store.pkg-finder": "workspace:*",
"@pnpm/types": "workspace:*",
"p-limit": "catalog:",

View File

@@ -6,6 +6,7 @@ import {
} from '@pnpm/lockfile.walker'
import { type DepTypes, DepType, detectDepTypes } from '@pnpm/lockfile.detect-dep-types'
import { type DependenciesField, type ProjectId, type Registries } from '@pnpm/types'
import { StoreIndex } from '@pnpm/store.index'
import { buildPurl, encodePurlName } from './purl.js'
import { getPkgMetadata, type GetPkgMetadataOptions } from './getPkgMetadata.js'
import { type SbomComponent, type SbomRelationship, type SbomResult, type SbomComponentType } from './types.js'
@@ -42,9 +43,13 @@ export async function collectSbomComponents (opts: CollectSbomComponentsOptions)
const relationships: SbomRelationship[] = []
const rootPurl = `pkg:npm/${encodePurlName(opts.rootName)}@${opts.rootVersion}`
const metadataOpts: GetPkgMetadataOptions | undefined = (!opts.lockfileOnly && opts.storeDir)
const storeIndex = (!opts.lockfileOnly && opts.storeDir)
? new StoreIndex(opts.storeDir)
: undefined
const metadataOpts: GetPkgMetadataOptions | undefined = (storeIndex && opts.storeDir)
? {
storeDir: opts.storeDir,
storeIndex,
lockfileDir: opts.lockfileDir,
virtualStoreDirMaxLength: opts.virtualStoreDirMaxLength ?? 120,
}
@@ -63,6 +68,7 @@ export async function collectSbomComponents (opts: CollectSbomComponentsOptions)
)
})
)
storeIndex?.close()
return {
rootComponent: {

View File

@@ -1,4 +1,5 @@
import { type PackageManifest, type Registries } from '@pnpm/types'
import { type StoreIndex } from '@pnpm/store.index'
import { readPackageFileMap } from '@pnpm/store.pkg-finder'
import { readPackageJson } from '@pnpm/read-package-json'
import { type PackageSnapshot, pkgSnapshotToResolution } from '@pnpm/lockfile.utils'
@@ -16,6 +17,7 @@ export interface PkgMetadata {
export interface GetPkgMetadataOptions {
storeDir: string
storeIndex: StoreIndex
lockfileDir: string
virtualStoreDirMaxLength: number
}

View File

@@ -30,6 +30,9 @@
{
"path": "../../pkg-manifest/read-package-json"
},
{
"path": "../../store/index"
},
{
"path": "../../store/pkg-finder"
}

View File

@@ -71,7 +71,6 @@ export interface Cafs {
addFilesFromDir: (dir: string) => AddToStoreResult
addFilesFromTarball: (buffer: Buffer) => AddToStoreResult
addFile: (buffer: Buffer, mode: number) => FileWriteResult
getIndexFilePathInCafs: (integrity: string, pkgId: string) => string
getFilePathByModeInCafs: (digest: string, mode: number) => string
importPackage: ImportPackageFunction
tempDir: () => Promise<string>

View File

@@ -31,7 +31,6 @@
"prepublishOnly": "pnpm run compile"
},
"dependencies": {
"@pnpm/crypto.integrity": "workspace:*",
"@pnpm/error": "workspace:*",
"@pnpm/fetcher-base": "workspace:*",
"@pnpm/graceful-fs": "workspace:*",

View File

@@ -1,5 +1,4 @@
import path from 'path'
import { parseIntegrity } from '@pnpm/crypto.integrity'
/**
* Checks if a file mode has any executable permissions set.
@@ -27,22 +26,6 @@ export function getFilePathByModeInCafs (
return path.join(storeDir, contentPathFromHex(fileType, hexDigest))
}
export function getIndexFilePathInCafs (
storeDir: string,
integrity: string,
pkgId: string
): string {
const { hexDigest } = parseIntegrity(integrity)
const hex = hexDigest.substring(0, 64)
// Some registries allow identical content to be published under different package names or versions.
// To accommodate this, index files are stored using both the content hash and package identifier.
// This approach ensures that we can:
// 1. Validate that the integrity in the lockfile corresponds to the correct package,
// which might not be the case after a poorly resolved Git conflict.
// 2. Allow the same content to be referenced by different packages or different versions of the same package.
return path.join(storeDir, `index/${path.join(hex.slice(0, 2), hex.slice(2))}-${pkgId.replace(/[\\/:*?"<>|]/g, '+')}.mpk`)
}
export function contentPathFromHex (fileType: FileType, hex: string): string {
const p = path.join('files', hex.slice(0, 2), hex.slice(2))
switch (fileType) {

View File

@@ -18,7 +18,6 @@ import {
type VerifyResult,
} from './checkPkgFilesIntegrity.js'
import {
getIndexFilePathInCafs,
contentPathFromHex,
type FileType,
getFilePathByModeInCafs,
@@ -37,7 +36,6 @@ export {
buildFileMapsFromIndex,
type FileType,
getFilePathByModeInCafs,
getIndexFilePathInCafs,
type Integrity,
type PackageFileInfo,
type PackageFiles,
@@ -60,7 +58,6 @@ export interface CafsFunctions {
addFilesFromDir: (dirname: string, opts?: { files?: string[], readManifest?: boolean, includeNodeModules?: boolean }) => AddToStoreResult
addFilesFromTarball: (tarballBuffer: Buffer, readManifest?: boolean) => AddToStoreResult
addFile: (buffer: Buffer, mode: number) => FileWriteResult
getIndexFilePathInCafs: (integrity: string, pkgId: string) => string
getFilePathByModeInCafs: (digest: string, mode: number) => string
}
@@ -71,7 +68,6 @@ export function createCafs (storeDir: string, { ignoreFile, cafsLocker }: Create
addFilesFromDir: addFilesFromDir.bind(null, addBuffer),
addFilesFromTarball: addFilesFromTarball.bind(null, addBuffer, ignoreFile ?? null),
addFile: addBuffer,
getIndexFilePathInCafs: getIndexFilePathInCafs.bind(null, storeDir),
getFilePathByModeInCafs: getFilePathByModeInCafs.bind(null, storeDir),
}
}

View File

@@ -12,9 +12,6 @@
{
"path": "../../__utils__/test-fixtures"
},
{
"path": "../../crypto/integrity"
},
{
"path": "../../fetching/fetcher-base"
},

store/index/README.md (new file)
View File

@@ -0,0 +1,29 @@
# @pnpm/store.index
> SQLite-backed index for the pnpm content-addressable store
## Why SQLite instead of individual index files?
Previously, pnpm stored package metadata as individual MessagePack (`.mpk`) files
under `$STORE/index/`. Each resolved package had its own file, keyed by its
integrity hash and package identifier. This worked but had several downsides at
scale:
- **Filesystem overhead.** Every lookup required `open` / `read` / `close`
syscalls, and every write needed an atomic `write` + `rename` per entry.
On repositories with thousands of dependencies the accumulated I/O was
significant.
- **Space inefficiency.** Small metadata entries still consumed a minimum
filesystem block each (typically 4 KiB), wasting space.
Storing all entries in a single SQLite database (`$STORE/index.db`) addresses
these issues:
- **Fewer syscalls.** Reads and writes go through SQLite's page cache and
memory-mapped I/O instead of individual file operations.
- **Space efficiency.** Small entries share database pages instead of each
occupying a full filesystem block.
- **Batch writes.** Multiple entries can be inserted in a single transaction,
reducing disk flushes.
## License
MIT
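
A minimal usage sketch of the key-value API this README describes, based on the exports in `store/index/src/index.ts` shown below (the store path and values here are placeholders):

```ts
import { StoreIndex, storeIndexKey } from '@pnpm/store.index'

// One StoreIndex per store directory; it opens (or creates) $STORE/index.db.
const storeIndex = new StoreIndex('/tmp/example-store/v11') // placeholder path

// Keys combine the integrity hash and package id.
const key = storeIndexKey('sha512-fakehash', 'is-positive@1.0.0') // placeholder values

// Values are plain objects; msgpackr handles (de)serialization to BLOBs.
storeIndex.set(key, { name: 'is-positive', version: '1.0.0' })
console.log(storeIndex.has(key), storeIndex.get(key))

// Always close so pending batched writes are flushed and `PRAGMA optimize` runs.
storeIndex.close()
```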

store/index/package.json (new file)
View File

@@ -0,0 +1,48 @@
{
"name": "@pnpm/store.index",
"version": "1000.0.0-0",
"description": "SQLite-backed index for the pnpm content-addressable store",
"keywords": [
"pnpm",
"pnpm11",
"store"
],
"license": "MIT",
"funding": "https://opencollective.com/pnpm",
"repository": "https://github.com/pnpm/pnpm/tree/main/store/index",
"homepage": "https://github.com/pnpm/pnpm/tree/main/store/index#readme",
"bugs": {
"url": "https://github.com/pnpm/pnpm/issues"
},
"type": "module",
"main": "lib/index.js",
"types": "lib/index.d.ts",
"exports": {
".": "./lib/index.js"
},
"files": [
"lib",
"!*.map"
],
"scripts": {
"lint": "eslint \"src/**/*.ts\" \"test/**/*.ts\"",
"_test": "cross-env NODE_OPTIONS=\"$NODE_OPTIONS --experimental-vm-modules\" jest",
"test": "pnpm run compile && pnpm run _test",
"compile": "tsgo --build && pnpm run lint --fix",
"prepublishOnly": "pnpm run compile"
},
"dependencies": {
"msgpackr": "catalog:"
},
"devDependencies": {
"@pnpm/store.index": "workspace:*",
"@types/node": "catalog:",
"tempy": "catalog:"
},
"engines": {
"node": ">=22.13"
},
"jest": {
"preset": "@pnpm/jest-config"
}
}

store/index/src/index.ts (new file)
View File

@@ -0,0 +1,267 @@
import { createRequire } from 'module'
import fs from 'fs'
import type { DatabaseSync as DatabaseSyncType, StatementSync } from 'node:sqlite'
import { Packr } from 'msgpackr'
// Use createRequire to load node:sqlite because it is a prefix-only builtin
// that Jest's ESM module resolver cannot handle.
const req = createRequire(import.meta.url)
const { DatabaseSync } = req('node:sqlite') as { DatabaseSync: typeof DatabaseSyncType }
const packr = new Packr({
useRecords: true,
moreTypes: true,
})
const SQLITE_BUSY = 5
const RETRY_DELAY_MS = 50
const MAX_RETRIES = 100 // ~5 seconds total
function sqliteRetry<T> (fn: () => T): T {
for (let attempt = 0; ; attempt++) {
try {
return fn()
} catch (err: unknown) {
if (isSqliteBusy(err) && attempt < MAX_RETRIES) {
sleepSync(RETRY_DELAY_MS)
continue
}
throw err
}
}
}
function isSqliteBusy (err: any): boolean { // eslint-disable-line @typescript-eslint/no-explicit-any
// errcode may be an extended error code (e.g. SQLITE_BUSY_RECOVERY = 261),
// so mask off the upper bits to get the primary error code.
return (err?.errcode & 0xFF) === SQLITE_BUSY
}
const sleepBuffer = new Int32Array(new SharedArrayBuffer(4))
function sleepSync (ms: number): void {
Atomics.wait(sleepBuffer, 0, 0, ms)
}
/**
* Pack data for storage using msgpackr.
* Use this when data will be packed in one thread and stored by another,
* to ensure the same Packr instance is used for pack and unpack within each thread.
*/
export function packForStorage (data: unknown): Uint8Array {
return packr.pack(data)
}
/**
* Create a store index key from an integrity hash and package id.
* The key is `${integrity}\t${pkgId}` — tab-separated.
* Integrity strings never contain tabs, so this is unambiguous.
*/
export function storeIndexKey (integrity: string, pkgId: string): string {
return `${integrity}\t${pkgId}`
}
export function gitHostedStoreIndexKey (pkgId: string, opts: { built: boolean }): string {
return storeIndexKey(pkgId, opts.built ? 'built' : 'not-built')
}
const openInstances = new Set<StoreIndex>()
/**
* Close all open StoreIndex instances.
* Useful in tests that need to remove the store directory.
*/
export function closeAllStoreIndexes (): void {
for (const si of openInstances) {
si.close()
}
}
export class StoreIndex {
private db: DatabaseSyncType
private closed = false
private pendingWrites: Array<{ key: string, buffer: Uint8Array }> = []
private flushScheduled = false
private stmtGet: StatementSync
private stmtSet: StatementSync
private stmtDel: StatementSync
private stmtHas: StatementSync
private stmtAll: StatementSync
private readonly exitHandler: () => void
constructor (storeDir: string) {
const dbPath = `${storeDir}/index.db`
fs.mkdirSync(storeDir, { recursive: true })
this.db = new DatabaseSync(dbPath)
// Set busy_timeout FIRST so SQLite's internal busy handler is active
// during all subsequent operations. On Windows, file locking is mandatory
// and concurrent processes (e.g. parallel dlx calls) will contend.
this.db.exec('PRAGMA busy_timeout=5000')
sqliteRetry(() => {
this.db.exec('PRAGMA journal_mode=WAL')
this.db.exec('PRAGMA synchronous=NORMAL')
// Increase memory map size to 512MB
this.db.exec('PRAGMA mmap_size=536870912')
// Increase page cache size to ~32MB
this.db.exec('PRAGMA cache_size=-32000')
this.db.exec('PRAGMA temp_store=MEMORY')
// Increase wal autocheckpoint interval to reduce I/O during heavy writes
this.db.exec('PRAGMA wal_autocheckpoint=10000')
this.db.exec(`
CREATE TABLE IF NOT EXISTS package_index (
key TEXT PRIMARY KEY,
data BLOB NOT NULL
) WITHOUT ROWID
`)
})
this.stmtGet = this.db.prepare('SELECT data FROM package_index WHERE key = ?')
this.stmtSet = this.db.prepare('INSERT OR REPLACE INTO package_index (key, data) VALUES (?, ?)')
this.stmtDel = this.db.prepare('DELETE FROM package_index WHERE key = ?')
this.stmtHas = this.db.prepare('SELECT 1 FROM package_index WHERE key = ?')
this.stmtAll = this.db.prepare('SELECT key, data FROM package_index')
this.exitHandler = () => this.close()
process.on('exit', this.exitHandler)
openInstances.add(this)
}
get (key: string): unknown | undefined {
const row = sqliteRetry(() => this.stmtGet.get(key)) as { data: Uint8Array } | undefined
if (row) {
return packr.unpack(row.data)
}
return undefined
}
set (key: string, data: unknown): void {
const buffer = packr.pack(data)
sqliteRetry(() => {
this.stmtSet.run(key, buffer)
})
}
delete (key: string): boolean {
let result!: { changes: number | bigint }
sqliteRetry(() => {
result = this.stmtDel.run(key)
})
return result.changes > 0
}
has (key: string): boolean {
return sqliteRetry(() => this.stmtHas.get(key)) != null
}
/**
* Iterate over all index entries.
* Yields [key, data] pairs where key is `integrity\tpkgId`.
*/
* entries (): IterableIterator<[string, unknown]> {
for (const row of this.stmtAll.iterate() as IterableIterator<{ key: string, data: Uint8Array }>) {
yield [row.key, packr.unpack(row.data)]
}
}
/**
* Queue pre-packed writes to be flushed on the next tick.
* Used by the fetch phase for throughput.
*/
queueWrites (writes: Array<{ key: string, buffer: Uint8Array }>): void {
for (const w of writes) {
this.pendingWrites.push(w)
}
if (!this.flushScheduled) {
this.flushScheduled = true
process.nextTick(() => this.flush())
}
}
/**
* Flush all pending queued writes immediately.
*/
flush (): void {
this.flushScheduled = false
if (this.pendingWrites.length === 0) return
this.setRawMany(this.pendingWrites)
this.pendingWrites = []
}
/**
* Write multiple pre-packed entries in a single transaction.
* The buffers must already be msgpack-encoded.
*/
setRawMany (entries: Array<{ key: string, buffer: Uint8Array }>): void {
if (this.closed || entries.length === 0) return
if (entries.length === 1) {
sqliteRetry(() => {
this.stmtSet.run(entries[0].key, entries[0].buffer)
})
return
}
sqliteRetry(() => {
this.db.exec('BEGIN IMMEDIATE')
let committed = false
try {
for (const { key, buffer } of entries) {
this.stmtSet.run(key, buffer)
}
this.db.exec('COMMIT')
committed = true
} finally {
if (!committed) {
try {
this.db.exec('ROLLBACK')
} catch {}
}
}
})
}
/**
* Delete multiple index entries in a single transaction,
* then VACUUM to reclaim disk space.
*/
deleteMany (keys: string[]): void {
if (keys.length === 0) return
if (keys.length === 1) {
this.delete(keys[0])
this.db.exec('VACUUM')
return
}
sqliteRetry(() => {
this.db.exec('BEGIN IMMEDIATE')
let committed = false
try {
for (const key of keys) {
this.stmtDel.run(key)
}
this.db.exec('COMMIT')
committed = true
} finally {
if (!committed) {
try {
this.db.exec('ROLLBACK')
} catch {}
}
}
})
this.db.exec('VACUUM')
}
close (): void {
if (this.closed) return
this.flush()
this.closed = true
openInstances.delete(this)
process.removeListener('exit', this.exitHandler)
try {
this.db.exec('PRAGMA optimize')
} catch {
// PRAGMA optimize is a performance hint; safe to ignore if the DB is locked.
}
try {
this.db.close()
} catch {
// The DB may be locked by another connection; the OS will reclaim it on process exit.
}
}
}
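
The `packForStorage` helper above exists so that one thread can serialize an entry and another can persist it, with `queueWrites` batching the pre-packed buffers on the receiving side. A rough sketch of that hand-off (the message shape and function names are illustrative, not the actual worker code):

```ts
import { parentPort } from 'worker_threads'
import { StoreIndex, packForStorage, storeIndexKey } from '@pnpm/store.index'

// Worker side: pack the entry with the worker's own Packr instance and
// send the raw bytes to the main process (message shape is an assumption).
export function reportIndexEntry (integrity: string, pkgId: string, filesIndex: unknown): void {
  parentPort?.postMessage({
    key: storeIndexKey(integrity, pkgId),
    buffer: packForStorage(filesIndex),
  })
}

// Main-process side: queue the pre-packed entry; StoreIndex batches queued
// writes and flushes them in a single transaction on process.nextTick.
export function onIndexEntry (storeIndex: StoreIndex, msg: { key: string, buffer: Uint8Array }): void {
  storeIndex.queueWrites([msg])
}
```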

store/index/test/index.ts (new file)
View File

@@ -0,0 +1,46 @@
import { StoreIndex, storeIndexKey } from '@pnpm/store.index'
import path from 'path'
import { temporaryDirectory } from 'tempy'
test('StoreIndex round-trips data via SQLite key', () => {
const storeDir = path.join(temporaryDirectory(), 'store', 'v11')
const idx = new StoreIndex(storeDir)
try {
const key = storeIndexKey('sha512-abc123', 'lodash@4.17.21')
expect(idx.get(key)).toBeUndefined()
const data = { algo: 'sha512', files: new Map([['index.js', { digest: 'abc', size: 100, mode: 0o644 }]]) }
idx.set(key, data)
const result = idx.get(key) as typeof data
expect(result).toBeDefined()
expect(result.algo).toBe('sha512')
expect(result.files.get('index.js')?.digest).toBe('abc')
expect(idx.has(key)).toBe(true)
expect(idx.delete(key)).toBe(true)
expect(idx.get(key)).toBeUndefined()
expect(idx.has(key)).toBe(false)
} finally {
idx.close()
}
})
test('StoreIndex entries() iterates all SQLite entries', () => {
const storeDir = path.join(temporaryDirectory(), 'store', 'v11')
const idx = new StoreIndex(storeDir)
try {
const key1 = storeIndexKey('sha512-aaa', 'pkg-a@1.0.0')
const key2 = storeIndexKey('sha512-bbb', 'pkg-b@2.0.0')
idx.set(key1, { a: 1 })
idx.set(key2, { b: 2 })
const entries = [...idx.entries()]
expect(entries).toHaveLength(2)
const keys = entries.map(([k]) => k)
expect(keys).toContain(key1)
expect(keys).toContain(key2)
} finally {
idx.close()
}
})

View File

@@ -0,0 +1,18 @@
{
"extends": "../tsconfig.json",
"compilerOptions": {
"noEmit": false,
"outDir": "../node_modules/.test.lib",
"rootDir": "..",
"isolatedModules": true
},
"include": [
"**/*.ts",
"../../../__typings__/**/*.d.ts"
],
"references": [
{
"path": ".."
}
]
}

store/index/tsconfig.json (new file)
View File

@@ -0,0 +1,12 @@
{
"extends": "@pnpm/tsconfig",
"compilerOptions": {
"outDir": "lib",
"rootDir": "src"
},
"include": [
"src/**/*.ts",
"../../__typings__/**/*.d.ts"
],
"references": []
}

View File

@@ -0,0 +1,8 @@
{
"extends": "./tsconfig.json",
"include": [
"src/**/*.ts",
"test/**/*.ts",
"../../__typings__/**/*.d.ts"
]
}

View File

@@ -48,12 +48,12 @@
"@pnpm/crypto.hash": "workspace:*",
"@pnpm/error": "workspace:*",
"@pnpm/fetcher-base": "workspace:*",
"@pnpm/fs.msgpack-file": "workspace:*",
"@pnpm/hooks.types": "workspace:*",
"@pnpm/package-requester": "workspace:*",
"@pnpm/resolver-base": "workspace:*",
"@pnpm/store-controller-types": "workspace:*",
"@pnpm/store.cafs": "workspace:*",
"@pnpm/store.index": "workspace:*",
"@pnpm/types": "workspace:*",
"@zkochan/rimraf": "catalog:",
"is-subdir": "catalog:",

View File

@@ -9,6 +9,7 @@ import {
   type StoreController,
 } from '@pnpm/store-controller-types'
 import { type CustomFetcher } from '@pnpm/hooks.types'
+import { type StoreIndex } from '@pnpm/store.index'
 import { addFilesFromDir, importPackage, initStoreDir } from '@pnpm/worker'
 import { prune } from './prune.js'
 
@@ -31,6 +32,7 @@ export interface CreatePackageStoreOptions {
   strictStorePkgContentCheck?: boolean
   clearResolutionCache: () => void
   customFetchers?: CustomFetcher[]
+  storeIndex: StoreIndex
 }
 
 export function createPackageStore (
@@ -64,7 +66,9 @@ export function createPackageStore (
   })
 
   return {
-    close: async () => {}, // eslint-disable-line:no-empty
+    close: async () => {
+      initOpts.storeIndex.flush()
+    },
     fetchPackage: packageRequester.fetchPackageToStore,
     getFilesIndexFilePath: packageRequester.getFilesIndexFilePath,
     importPackage: initOpts.importPackage
@@ -75,7 +79,7 @@ export function createPackageStore (
       storeDir: initOpts.storeDir,
       targetDir,
     }),
-    prune: prune.bind(null, { storeDir, cacheDir: initOpts.cacheDir }),
+    prune: prune.bind(null, { storeDir, cacheDir: initOpts.cacheDir, storeIndex: initOpts.storeIndex }),
     requestPackage: packageRequester.requestPackage,
     upload,
     clearResolutionCache: initOpts.clearResolutionCache,
@@ -84,6 +88,7 @@ export function createPackageStore (
   async function upload (builtPkgLocation: string, opts: { filesIndexFile: string, sideEffectsCacheKey: string }): Promise<void> {
     await addFilesFromDir({
       storeDir: cafs.storeDir,
+      storeIndex: initOpts.storeIndex,
       dir: builtPkgLocation,
       sideEffectsCacheKey: opts.sideEffectsCacheKey,
       filesIndexFile: opts.filesIndexFile,
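With these hunks, the controller's `close()` only flushes pending index writes; closing the database stays with whoever created the `StoreIndex`. An illustrative teardown ordering (the wiring around the two `close` calls is hypothetical; only the `flush`/`close` behavior comes from the diff):

```ts
import { StoreIndex } from '@pnpm/store.index'

// Illustrative teardown only: a real store controller takes many more options
// than the hunks above show, so it is typed minimally here.
async function shutdown (
  storeController: { close: () => Promise<void> },
  storeIndex: StoreIndex
): Promise<void> {
  await storeController.close() // flushes pending index writes (see hunk above)
  storeIndex.close() // auto-flushes, runs PRAGMA optimize, closes the SQLite DB
}
```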


@@ -1,7 +1,7 @@
 import { type Dirent, promises as fs } from 'fs'
 import util from 'util'
 import path from 'path'
-import { readMsgpackFile } from '@pnpm/fs.msgpack-file'
+import { type StoreIndex } from '@pnpm/store.index'
 import { type PackageFilesIndex } from '@pnpm/store.cafs'
 import { globalInfo, globalWarn } from '@pnpm/logger'
 import rimraf from '@zkochan/rimraf'
@@ -12,9 +12,10 @@ const BIG_ONE = BigInt(1) as unknown
 export interface PruneOptions {
   cacheDir: string
   storeDir: string
+  storeIndex: StoreIndex
 }
 
-export async function prune ({ cacheDir, storeDir }: PruneOptions, removeAlienFiles?: boolean): Promise<void> {
+export async function prune ({ cacheDir, storeDir, storeIndex }: PruneOptions, removeAlienFiles?: boolean): Promise<void> {
   // 1. First, prune the global virtual store
   // This must happen BEFORE pruning the CAS, because removing packages from
   // the virtual store will reduce hard link counts on files in the CAS
@@ -37,17 +38,6 @@ export async function prune ({ cacheDir, storeDir }: PruneOptions, removeAlienFi
 
   // 3. Prune the content-addressable store (CAS)
   const cafsDir = path.join(storeDir, 'files')
-  const pkgIndexFiles = [] as string[]
-  const indexDir = path.join(storeDir, 'index')
-  await Promise.all((await getSubdirsSafely(indexDir)).map(async (dir) => {
-    const subdir = path.join(indexDir, dir)
-    await Promise.all((await fs.readdir(subdir)).map(async (fileName) => {
-      const filePath = path.join(subdir, fileName)
-      if (fileName.endsWith('.mpk')) {
-        pkgIndexFiles.push(filePath)
-      }
-    }))
-  }))
   const removedHashes = new Set<string>()
   const dirs = await getSubdirsSafely(cafsDir)
   let fileCounter = 0
@@ -55,10 +45,6 @@ export async function prune ({ cacheDir, storeDir }: PruneOptions, removeAlienFi
     const subdir = path.join(cafsDir, dir)
     await Promise.all((await fs.readdir(subdir)).map(async (fileName) => {
       const filePath = path.join(subdir, fileName)
-      if (fileName.endsWith('.mpk')) {
-        pkgIndexFiles.push(filePath)
-        return
-      }
       const stat = await fs.stat(filePath)
       if (stat.isDirectory()) {
         if (removeAlienFiles) {
@@ -82,17 +68,19 @@ export async function prune ({ cacheDir, storeDir }: PruneOptions, removeAlienFi
   }))
   globalInfo(`Removed ${fileCounter} file${fileCounter === 1 ? '' : 's'}`)
 
-  // 4. Clean up orphaned package index files
+  // 4. Clean up orphaned package index entries
   let pkgCounter = 0
-  await Promise.all(pkgIndexFiles.map(async (pkgIndexFilePath) => {
-    const pkgFilesIndex = await readMsgpackFile<PackageFilesIndex>(pkgIndexFilePath)
+  const toDelete: string[] = []
+  for (const [filesIndexFile, data] of storeIndex.entries()) {
+    const pkgFilesIndex = data as PackageFilesIndex
     const pkgJson = pkgFilesIndex.files.get('package.json')
     // TODO: implement prune of Node.js packages, they don't have a package.json file
     if (pkgJson && removedHashes.has(pkgJson.digest)) {
-      await fs.unlink(pkgIndexFilePath)
+      toDelete.push(filesIndexFile)
       pkgCounter++
     }
-  }))
+  }
+  storeIndex.deleteMany(toDelete)
 
   globalInfo(`Removed ${pkgCounter} package${pkgCounter === 1 ? '' : 's'}`)
 }
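The rewritten cleanup iterates `storeIndex.entries()` to completion before calling `deleteMany()`, so the index is never mutated while it is being read. A hypothetical sketch of such an iterator over `node:sqlite` (the table and column names are assumptions, not the PR's schema):

```ts
import { DatabaseSync } from 'node:sqlite'
import { unpack } from 'msgpackr'

// Hypothetical entries() iterator; the real StoreIndex implementation may
// differ. The table/column names below are illustrative.
function * entries (db: DatabaseSync): IterableIterator<[string, unknown]> {
  const stmt = db.prepare('SELECT key, value FROM store_index')
  // .all() materializes the whole result set up front, so a caller that
  // finishes the loop before deleting (as prune does) never reads and writes
  // the table at the same time.
  for (const row of stmt.all() as Array<{ key: string, value: Uint8Array }>) {
    yield [row.key, unpack(row.value)]
  }
}
```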


@@ -3,6 +3,7 @@ import path from 'path'
 import { createClient } from '@pnpm/client'
 import { createPackageStore } from '@pnpm/package-store'
 import { type FetchPackageToStoreFunction } from '@pnpm/store-controller-types'
+import { StoreIndex } from '@pnpm/store.index'
 import { temporaryDirectory } from 'tempy'
 
 describe('store.importPackage()', () => {
@@ -12,11 +13,13 @@ describe('store.importPackage()', () => {
   const cacheDir = path.join(tmp, 'cache')
   const registry = 'https://registry.npmjs.org/'
   const authConfig = { registry }
+  const storeIndex = new StoreIndex(storeDir)
   const { resolve, fetchers, clearResolutionCache } = createClient({
     authConfig,
     cacheDir: path.join(tmp, 'cache'),
     storeDir: path.join(tmp, 'store'),
     rawConfig: {},
+    storeIndex,
     registries: {
       default: registry,
     },
@@ -27,6 +30,7 @@ describe('store.importPackage()', () => {
     verifyStoreIntegrity: true,
     virtualStoreDirMaxLength: 120,
     clearResolutionCache,
+    storeIndex,
   })
   const pkgId = 'registry.npmjs.org/is-positive/1.0.0'
   const fetchResponse = (storeController.fetchPackage as FetchPackageToStoreFunction)({
@@ -55,11 +59,13 @@ describe('store.importPackage()', () => {
   const cacheDir = path.join(tmp, 'cache')
   const registry = 'https://registry.npmjs.org/'
   const authConfig = { registry }
+  const storeIndex = new StoreIndex(storeDir)
   const { resolve, fetchers, clearResolutionCache } = createClient({
     authConfig,
     cacheDir: path.join(tmp, 'cache'),
     storeDir: path.join(tmp, 'store'),
     rawConfig: {},
+    storeIndex,
     registries: {
       default: registry,
     },
@@ -71,6 +77,7 @@ describe('store.importPackage()', () => {
     verifyStoreIntegrity: true,
     virtualStoreDirMaxLength: 120,
     clearResolutionCache,
+    storeIndex,
   })
   const pkgId = 'registry.npmjs.org/is-positive/1.0.0'
   const fetchResponse = (storeController.fetchPackage as FetchPackageToStoreFunction)({


@@ -18,9 +18,6 @@
     {
       "path": "../../fetching/fetcher-base"
     },
-    {
-      "path": "../../fs/msgpack-file"
-    },
     {
       "path": "../../hooks/types"
     },
@@ -51,6 +48,9 @@
     {
       "path": "../create-cafs-store"
     },
+    {
+      "path": "../index"
+    },
     {
       "path": "../store-controller-types"
     }


@@ -32,9 +32,9 @@
"dependencies": {
"@pnpm/dependency-path": "workspace:*",
"@pnpm/directory-fetcher": "workspace:*",
"@pnpm/fs.msgpack-file": "workspace:*",
"@pnpm/resolver-base": "workspace:*",
"@pnpm/store.cafs": "workspace:*"
"@pnpm/store.cafs": "workspace:*",
"@pnpm/store.index": "workspace:*"
},
"devDependencies": {
"@pnpm/store.pkg-finder": "workspace:*"

Some files were not shown because too many files have changed in this diff.