feat: add pnpm pack-app command for packing CJS entries into standalone executables (#11312)

* fix: give each runtime variant its own global virtual store entry

When a runtime package (e.g. node@runtime:X.Y.Z) uses a variations
resolution, createFullPkgId() in @pnpm/deps.graph-hasher was hashing
the whole VariationsResolution — the same hash on every host — so the
global virtual store path collided between variants. Whichever variant
installed first won, and a later `pnpm add --libc=musl node@runtime:<v>`
silently reused the cached glibc (or macOS/Windows) binary.

The fix threads supportedArchitectures down to createFullPkgId so the
selected variant's integrity is used as the package fingerprint. Two
related cleanups land with it:

- Extract the platform-variant selection logic to @pnpm/resolving.resolver-base
  as selectPlatformVariant/resolvePlatformSelector. The helper's libc
  match also required a fix: a variant with no libc is the "default"
  build, and a request for a non-default libc (e.g. musl) must require
  an exact match so the default variant doesn't silently win.
- @pnpm/installing.package-requester's findResolution now delegates to
  the shared helper, and the new supportedArchitectures param is plumbed
  through calcDepState / calcGraphNodeHash / iterateHashedGraphNodes /
  lockfileToDepGraph and their callers in deps-resolver, deps-restorer,
  deps-installer, graph-builder, and building.after-install.

* feat: add pnpm build-sea command for building Node.js SEA executables

Adds `pnpm build-sea` under @pnpm/releasing.commands. Takes a CommonJS
entry file and a set of target triplets (linux-x64, linux-x64-musl,
linux-arm64, linux-arm64-musl, macos-x64, macos-arm64, win-x64,
win-arm64) and produces a standalone executable per target under
dist-sea/<target>/.

Each target's Node.js runtime is fetched via `pnpm add node@runtime:<v>
--os=<os> --cpu=<arch> --libc=<libc>` into $PNPM_HOME/build-sea/<target>-<v>/
so binaries are hardlinked from the global content-addressable store and
`pnpm store prune` can reclaim them.

Requires Node.js v25.5+ to perform the --build-sea injection. If the
running Node is older, a v25 binary is downloaded and used as the builder
automatically. macOS outputs are ad-hoc signed with codesign (on macOS)
or ldid (when cross-compiling from Linux), which is required because SEA
injection invalidates the binary's existing signature.

* fix(build-sea): reject malformed --target, --output-name and use mkdtemp for config

Addresses Copilot review feedback on the build-sea command:

- parseTarget() previously destructured the target string, silently
  accepting extra `-` segments. Inputs like `linux-x64-musl-../../outside`
  would pass validation and flow into path.join. Validation is now done
  with a strict anchored regex.
- --output-name was passed into path.join() without sanitization, so a
  caller could escape the output directory with path separators or `..`.
  validateOutputName() now rejects anything that isn't a plain basename.
- The per-target SEA config file was written to a predictable path under
  os.tmpdir() (derived from the target name and Date.now()), which is
  unsafe on multi-user systems. It now lives inside a fresh mkdtemp()
  directory and is opened with the exclusive "wx" flag.
- New test cases cover extra-segment targets, uppercase/whitespace
  variants, and the full matrix of invalid --output-name inputs.

* rename: build-sea → pack-app

`build-sea` required knowing what a SEA is. `pack-app` is self-describing,
doesn't collide with pnpm's existing `bin` concept, and parallels the
existing `pack` command.

- Command name: build-sea → pack-app
- Default output dir: dist-sea → dist-app
- Error codes: PACK_APP_* (was BUILD_SEA_*)
- Export/type: packApp / PackAppOptions (was buildSea / BuildSeaOptions)
- Install cache dir: $PNPM_HOME/pack-app (was $PNPM_HOME/build-sea)

The Node.js `--build-sea` flag name itself is unchanged — that's a
Node.js feature and outside this project's naming.

* fix(pack-app): reject directory entries, pin builder to >=25.5, refuse macOS target on Windows

Addresses Copilot review feedback on the pack-app command:

- entry validation now rejects non-file paths (directories, symlinks to
  non-files) with a dedicated PACK_APP_ENTRY_NOT_FILE instead of
  surfacing a less actionable error later in the SEA build.
- DEFAULT_BUILDER_SPEC was the bare major ("25"), which would satisfy
  with 25.0.x if that version is still present — those point releases
  predate --build-sea support. Tightened to ">=25.5.0 <26.0.0" so the
  download is guaranteed to support the flag without ever crossing a
  major.
- adHocSignMacBinary() silently skipped re-signing on Windows hosts.
  Now throws PACK_APP_MACOS_SIGN_UNSUPPORTED_HOST with a hint to build
  the target on macOS/Linux or re-sign manually.
- resolvePlatformSelector() JSDoc now matches what the code actually
  does (picks the first entry when it is not "current"; later entries
  are ignored).
- New test case covers the directory-as-entry rejection.

* refactor(pack-app): switch target OS names to process.platform constants

Previously `pack-app` accepted `macos-*` / `win-*` as the OS portion of a
target triplet and translated them to `darwin` / `win32` internally. The
translation layer made the CLI surface inconsistent with the values that
`pnpm add --os=…` and `supportedArchitectures.os` already use, and added
a small footgun (e.g. users setting `supportedArchitectures: { os: [darwin] }`
but typing `macos-arm64` for pack-app).

The supported target OS set is now `linux | darwin | win32`, matching
`process.platform`. Old inputs like `macos-arm64` or `win-x64` now fail
validation with a clear error pointing to the new naming. The internal
parseTarget helper drops its TARGET_OS_MAP lookup entirely.

This is a change to an unreleased command so there is no back-compat
concern. pnpm's own artifact directory names (`pnpm/artifacts/macos-*/`,
`pnpm/artifacts/win-*/`) are an internal implementation detail and are
not affected by this change.

* feat(pack-app): read defaults from pnpm.app in package.json

Every pack-app flag (--entry, --target, --node-version, --output-dir,
--output-name) can now be preconfigured in the project's package.json
under a new "pnpm.app" object:

  {
    "name": "my-cli",
    "pnpm": {
      "app": {
        "entry": "dist/index.cjs",
        "targets": ["linux-x64", "darwin-arm64", "win32-x64"],
        "nodeVersion": "25",
        "outputDir": "release",
        "outputName": "my-cli"
      }
    }
  }

CLI flags always win. --target replaces the configured list rather than
appending, so a user can narrow the default set at the command line.

The config loader is strict: unknown keys under pnpm.app and any
type-mismatched values throw PACK_APP_INVALID_CONFIG so mistakes surface
at invocation time instead of silently being ignored.

Chose pnpm.app over pnpm.packApp because it's the shorter, cleaner
namespace for anything related to the app bundle (future sibling
commands like run-app / deploy-app could share the same object without
a naming clash). Chose package.json over pnpm-workspace.yaml because
the config is inherently per-project, whereas pnpm-workspace.yaml is
workspace-root-only.

* fix(pack-app): deterministic libc selection and stricter output-name validation

Addresses Copilot review feedback:

- ensureNodeRuntime() now always passes an explicit --libc for linux
  targets. Without a suffix, linux-x64 and linux-arm64 default to
  --libc=glibc instead of letting the user's supportedArchitectures.libc
  config or the host's detected libc decide the variant. The install
  cache directory mirrors this, so glibc and musl variants are always
  distinct (linux-x64-glibc vs linux-x64-musl).
- resolveBuilderBinary() now pins the host libc when downloading a
  builder Node on Linux. A user whose config sets supportedArchitectures.libc
  to musl no longer ends up with a musl Node that the glibc host cannot
  execute.
- validateOutputName() rejects Windows-invalid filename characters
  (<>:"|?* and NUL), Windows reserved device names (CON, NUL, COM1, etc.),
  and names ending in a dot or space — problems surface at invocation
  time rather than during writeFile(outputFile, ...) on Windows.
- lockfileToDepGraph variants tests no longer derive the "host"
  variant from process.platform/process.arch; they always pass an
  explicit supportedArchitectures selector so the expectations hold on
  any CI host (including Alpine/musl).

* chore: add "toctou" to cspell wordlist

`TOCTOU` (time-of-check-to-time-of-use) is the standard term for the
race-condition class the pack-app SEA-config comment describes. Adding
it to the wordlist unblocks the Lint CI step.

* fix: lint
This commit is contained in:
Zoltan Kochan
2026-04-20 14:29:49 +02:00
committed by GitHub
parent bcc88a1239
commit 72c1e050e9
25 changed files with 1051 additions and 60 deletions

View File

@@ -0,0 +1,6 @@
---
"@pnpm/releasing.commands": minor
"pnpm": minor
---
Added a new `pnpm pack-app` command that packs a CommonJS entry file into a standalone executable for one or more target platforms, using the [Node.js Single Executable Applications](https://nodejs.org/api/single-executable-applications.html) API under the hood. Targets are specified as `<os>-<arch>[-<libc>]` (e.g. `linux-x64`, `linux-x64-musl`, `macos-arm64`, `win-x64`) and each produces an executable under `dist-app/<target>/` by default. Requires Node.js v25.5+ to perform the injection; an older host downloads Node.js v25 automatically.

View File

@@ -0,0 +1,13 @@
---
"@pnpm/deps.graph-hasher": minor
"@pnpm/resolving.resolver-base": minor
"@pnpm/installing.deps-installer": patch
"@pnpm/installing.deps-resolver": patch
"@pnpm/installing.deps-restorer": patch
"@pnpm/installing.package-requester": patch
"@pnpm/building.after-install": patch
"@pnpm/deps.graph-builder": patch
"pnpm": patch
---
Fix: different platform variants of the same runtime (e.g. `node@runtime:25.9.0` glibc vs. musl) no longer share a single global-virtual-store entry. The virtual store path now incorporates the selected variant's integrity, so installs with different `--os`/`--cpu`/`--libc` end up in separate directories and `pnpm add --libc=musl node@runtime:<v>` reliably fetches the musl binary even when the glibc variant is already cached.

View File

@@ -4,7 +4,7 @@ import { DEFAULT_REGISTRIES, normalizeRegistries } from '@pnpm/config.normalize-
import type { Config, ConfigContext } from '@pnpm/config.reader'
import type { LogBase } from '@pnpm/logger'
import type { StoreController } from '@pnpm/store.controller-types'
import type { Registries, RegistryConfig } from '@pnpm/types'
import type { Registries, RegistryConfig, SupportedArchitectures } from '@pnpm/types'
import { loadJsonFile } from 'load-json-file'
export type StrictBuildOptions = {
@@ -51,6 +51,7 @@ export type StrictBuildOptions = {
peersSuffixMaxLength: number
strictStorePkgContentCheck: boolean
fetchFullMetadata?: boolean
supportedArchitectures?: SupportedArchitectures
} & Pick<Config, 'allowBuilds'>
export type BuildOptions = Partial<StrictBuildOptions> &

View File

@@ -279,7 +279,7 @@ async function _rebuild (
} & Pick<PnpmContext, 'modulesFile'>,
opts: StrictBuildOptions
): Promise<{ pkgsThatWereRebuilt: Set<string>, ignoredPkgs: IgnoredBuilds }> {
const depGraph = lockfileToDepGraph(ctx.currentLockfile)
const depGraph = lockfileToDepGraph(ctx.currentLockfile, opts.supportedArchitectures)
const depsStateCache: DepsStateCache = {}
const pkgsThatWereRebuilt = new Set<string>()
const graph = new Map()
@@ -365,6 +365,7 @@ async function _rebuild (
if (pkgFilesIndex) {
sideEffectsCacheKey = calcDepState(depGraph, depsStateCache, depPath, {
includeDepGraphHash: true,
supportedArchitectures: opts.supportedArchitectures,
})
if (pkgFilesIndex.sideEffects?.has(sideEffectsCacheKey)) {
pkgsThatWereRebuilt.add(depPath)

View File

@@ -328,6 +328,7 @@
"tempy",
"testcase",
"TLSV",
"toctou",
"todomvc",
"toplevel",
"tsgo",

View File

@@ -15,7 +15,7 @@ import type { LockfileObject } from '@pnpm/lockfile.fs'
import {
nameVerFromPkgSnapshot,
} from '@pnpm/lockfile.utils'
import type { AllowBuild, DepPath } from '@pnpm/types'
import type { AllowBuild, DepPath, SupportedArchitectures } from '@pnpm/types'
interface PkgSnapshotWithLocation {
pkgMeta: PkgMetaAndSnapshot
@@ -28,16 +28,17 @@ export function * iteratePkgsForVirtualStore (lockfile: LockfileObject, opts: {
virtualStoreDirMaxLength: number
virtualStoreDir: string
globalVirtualStoreDir: string
supportedArchitectures?: SupportedArchitectures
}): IterableIterator<PkgSnapshotWithLocation> {
if (opts.enableGlobalVirtualStore) {
for (const { hash, pkgMeta } of hashDependencyPaths(lockfile, opts.allowBuild)) {
for (const { hash, pkgMeta } of hashDependencyPaths(lockfile, opts.allowBuild, opts.supportedArchitectures)) {
yield {
dirInVirtualStore: path.join(opts.globalVirtualStoreDir, hash),
pkgMeta,
}
}
} else if (lockfile.packages) {
let graphNodeHashOpts: { graph: DepsGraph<DepPath>, cache: DepsStateCache } | undefined
let graphNodeHashOpts: { graph: DepsGraph<DepPath>, cache: DepsStateCache, supportedArchitectures?: SupportedArchitectures } | undefined
for (const depPath in lockfile.packages) {
if (!Object.hasOwn(lockfile.packages, depPath)) {
continue
@@ -55,7 +56,8 @@ export function * iteratePkgsForVirtualStore (lockfile: LockfileObject, opts: {
if (dp.isRuntimeDepPath(depPath as DepPath)) {
graphNodeHashOpts ??= {
cache: {},
graph: lockfileToDepGraph(lockfile),
graph: lockfileToDepGraph(lockfile, opts.supportedArchitectures),
supportedArchitectures: opts.supportedArchitectures,
}
const hash = calcGraphNodeHash(graphNodeHashOpts, pkgMeta)
dirInVirtualStore = path.join(opts.globalVirtualStoreDir, hash)
@@ -70,7 +72,11 @@ export function * iteratePkgsForVirtualStore (lockfile: LockfileObject, opts: {
}
}
function hashDependencyPaths (lockfile: LockfileObject, allowBuild?: AllowBuild): IterableIterator<HashedDepPath<PkgMetaAndSnapshot>> {
const graph = lockfileToDepGraph(lockfile)
return iterateHashedGraphNodes(graph, iteratePkgMeta(lockfile, graph), allowBuild)
function hashDependencyPaths (
lockfile: LockfileObject,
allowBuild?: AllowBuild,
supportedArchitectures?: SupportedArchitectures
): IterableIterator<HashedDepPath<PkgMetaAndSnapshot>> {
const graph = lockfileToDepGraph(lockfile, supportedArchitectures)
return iterateHashedGraphNodes(graph, iteratePkgMeta(lockfile, graph), allowBuild, supportedArchitectures)
}

View File

@@ -36,7 +36,9 @@
"@pnpm/deps.path": "workspace:*",
"@pnpm/lockfile.types": "workspace:*",
"@pnpm/lockfile.utils": "workspace:*",
"@pnpm/types": "workspace:*"
"@pnpm/resolving.resolver-base": "workspace:*",
"@pnpm/types": "workspace:*",
"detect-libc": "catalog:"
},
"devDependencies": {
"@pnpm/deps.graph-hasher": "workspace:*"

View File

@@ -3,7 +3,9 @@ import { hashObject, hashObjectWithoutSorting } from '@pnpm/crypto.object-hasher
import { getPkgIdWithPatchHash, refToRelative } from '@pnpm/deps.path'
import type { LockfileObject, LockfileResolution, PackageSnapshot } from '@pnpm/lockfile.types'
import { nameVerFromPkgSnapshot } from '@pnpm/lockfile.utils'
import type { AllowBuild, DepPath, PkgIdWithPatchHash } from '@pnpm/types'
import { resolvePlatformSelector, selectPlatformVariant } from '@pnpm/resolving.resolver-base'
import type { AllowBuild, DepPath, PkgIdWithPatchHash, SupportedArchitectures } from '@pnpm/types'
import { familySync } from 'detect-libc'
export type DepsGraph<T extends string> = Record<T, DepsGraphNode<T>>
@@ -27,11 +29,12 @@ export function calcDepState<T extends string> (
opts: {
patchFileHash?: string
includeDepGraphHash: boolean
supportedArchitectures?: SupportedArchitectures
}
): string {
let result = ENGINE_NAME
if (opts.includeDepGraphHash) {
const depGraphHash = calcDepGraphHash(depsGraph, cache, new Set(), depPath)
const depGraphHash = calcDepGraphHash(depsGraph, cache, new Set(), depPath, opts.supportedArchitectures)
result += `;deps=${depGraphHash}`
}
if (opts.patchFileHash) {
@@ -44,7 +47,8 @@ function calcDepGraphHash<T extends string> (
depsGraph: DepsGraph<T>,
cache: DepsStateCache,
parents: Set<string>,
depPath: T
depPath: T,
supportedArchitectures?: SupportedArchitectures
): string {
if (cache[depPath]) return cache[depPath]
const node = depsGraph[depPath]
@@ -56,16 +60,15 @@ function calcDepGraphHash<T extends string> (
if (!node.resolution) {
throw new Error(`resolution is not defined for ${depPath} in depsGraph`)
}
node.fullPkgId = createFullPkgId(node.pkgIdWithPatchHash, node.resolution)
node.fullPkgId = createFullPkgId(node.pkgIdWithPatchHash, node.resolution, supportedArchitectures)
}
const deps: Record<string, string> = {}
if (Object.keys(node.children).length && !parents.has(node.fullPkgId)) {
const nextParents = new Set([...Array.from(parents), node.fullPkgId])
const _calcDepGraphHash = calcDepGraphHash.bind(null, depsGraph, cache, nextParents)
for (const alias in node.children) {
if (Object.hasOwn(node.children, alias)) {
const childId = node.children[alias]
deps[alias] = _calcDepGraphHash(childId)
deps[alias] = calcDepGraphHash(depsGraph, cache, nextParents, childId, supportedArchitectures)
}
}
}
@@ -92,7 +95,8 @@ export interface HashedDepPath<T extends PkgMeta> {
export function * iterateHashedGraphNodes<T extends PkgMeta> (
graph: DepsGraph<DepPath>,
pkgMetaIterator: PkgMetaIterator<T>,
allowBuild?: AllowBuild
allowBuild?: AllowBuild,
supportedArchitectures?: SupportedArchitectures
): IterableIterator<HashedDepPath<T>> {
let builtDepPaths: Set<DepPath> | undefined
let entries: Iterable<T>
@@ -103,26 +107,28 @@ export function * iterateHashedGraphNodes<T extends PkgMeta> (
} else {
entries = pkgMetaIterator
}
const _calcGraphNodeHash = calcGraphNodeHash.bind(null, {
const ctx = {
graph,
cache: {},
builtDepPaths,
buildRequiredCache: builtDepPaths !== undefined ? {} : undefined,
})
supportedArchitectures,
}
for (const pkgMeta of entries) {
yield {
hash: _calcGraphNodeHash(pkgMeta),
hash: calcGraphNodeHash(ctx, pkgMeta),
pkgMeta,
}
}
}
export function calcGraphNodeHash<T extends PkgMeta> (
{ graph, cache, builtDepPaths, buildRequiredCache }: {
{ graph, cache, builtDepPaths, buildRequiredCache, supportedArchitectures }: {
graph: DepsGraph<DepPath>
cache: DepsStateCache
builtDepPaths?: Set<DepPath>
buildRequiredCache?: Record<string, boolean>
supportedArchitectures?: SupportedArchitectures
},
pkgMeta: T
): string {
@@ -135,7 +141,7 @@ export function calcGraphNodeHash<T extends PkgMeta> (
const includeEngine = builtDepPaths === undefined ||
transitivelyRequiresBuild(graph, builtDepPaths, buildRequiredCache ??= {}, depPath, new Set())
const engine = includeEngine ? ENGINE_NAME : null
const deps = calcDepGraphHash(graph, cache, new Set(), depPath)
const deps = calcDepGraphHash(graph, cache, new Set(), depPath, supportedArchitectures)
const hexDigest = hashObjectWithoutSorting({ engine, deps }, { encoding: 'hex' })
return formatGlobalVirtualStorePath(name, version, hexDigest)
}
@@ -179,7 +185,10 @@ export function * iteratePkgMeta (lockfile: LockfileObject, graph: DepsGraph<Dep
}
}
export function lockfileToDepGraph (lockfile: LockfileObject): DepsGraph<DepPath> {
export function lockfileToDepGraph (
lockfile: LockfileObject,
supportedArchitectures?: SupportedArchitectures
): DepsGraph<DepPath> {
const graph: DepsGraph<DepPath> = {}
if (lockfile.packages != null) {
for (const [depPath, pkgSnapshot] of Object.entries(lockfile.packages)) {
@@ -189,7 +198,7 @@ export function lockfileToDepGraph (lockfile: LockfileObject): DepsGraph<DepPath
})
graph[depPath as DepPath] = {
children,
fullPkgId: createFullPkgId(getPkgIdWithPatchHash(depPath as DepPath), pkgSnapshot.resolution),
fullPkgId: createFullPkgId(getPkgIdWithPatchHash(depPath as DepPath), pkgSnapshot.resolution, supportedArchitectures),
}
}
}
@@ -251,7 +260,33 @@ function lockfileDepsToGraphChildren (deps: Record<string, string>): Record<stri
return children
}
function createFullPkgId (pkgIdWithPatchHash: PkgIdWithPatchHash, resolution: LockfileResolution): string {
const res = 'integrity' in resolution ? String(resolution.integrity) : hashObject(resolution)
return `${pkgIdWithPatchHash}:${res}`
function createFullPkgId (
pkgIdWithPatchHash: PkgIdWithPatchHash,
resolution: LockfileResolution,
supportedArchitectures?: SupportedArchitectures
): string {
if ('integrity' in resolution && resolution.integrity != null) {
return `${pkgIdWithPatchHash}:${resolution.integrity}`
}
if ('type' in resolution && resolution.type === 'variations') {
// Variations resolutions list every platform variant for a runtime (e.g. all
// OS/arch combinations for a Node.js version). Hashing the whole object
// would be identical across hosts, so two projects that install different
// variants of the same runtime would collide on the same virtual store
// directory — the first install would "win" and subsequent installs with
// different --os/--cpu/--libc would silently reuse the cached variant.
// Incorporate the chosen variant's integrity instead so each variant gets
// its own entry in the global virtual store.
const selector = resolvePlatformSelector(supportedArchitectures, {
platform: process.platform,
arch: process.arch,
libc: familySync(),
})
const variant = selectPlatformVariant(resolution.variants, selector)
const chosenResolution = variant?.resolution
if (chosenResolution && 'integrity' in chosenResolution && chosenResolution.integrity != null) {
return `${pkgIdWithPatchHash}:${chosenResolution.integrity}`
}
}
return `${pkgIdWithPatchHash}:${hashObject(resolution)}`
}

View File

@@ -1,4 +1,5 @@
import { lockfileToDepGraph } from '@pnpm/deps.graph-hasher'
import type { BinaryResolution } from '@pnpm/resolving.resolver-base'
import type { DepPath } from '@pnpm/types'
test('lockfileToDepGraph', () => {
@@ -51,3 +52,81 @@ test('lockfileToDepGraph', () => {
},
})
})
describe('lockfileToDepGraph with variations resolution', () => {
const glibcVariantIntegrity = 'sha256-glibc=='
const muslVariantIntegrity = 'sha256-musl=='
const darwinVariantIntegrity = 'sha256-darwin=='
// Always-explicit selectors — don't rely on process.platform / host libc so
// these tests produce the same result on glibc, musl, macOS, and Windows CI.
const linuxGlibcSelector = { os: ['linux'], cpu: ['x64'], libc: ['glibc'] }
const linuxMuslSelector = { os: ['linux'], cpu: ['x64'], libc: ['musl'] }
const darwinSelector = { os: ['darwin'], cpu: ['arm64'] }
function variantResolution (integrity: string): BinaryResolution {
return {
type: 'binary',
archive: 'tarball',
bin: 'bin/node',
integrity,
url: `https://example.com/${integrity}.tar.gz`,
}
}
const pkgWithVariants = {
resolution: {
type: 'variations' as const,
variants: [
{
// Linux default (glibc) — variant has no libc marker.
targets: [{ os: 'linux', cpu: 'x64' }],
resolution: variantResolution(glibcVariantIntegrity),
},
{
targets: [{ os: 'linux', cpu: 'x64', libc: 'musl' as const }],
resolution: variantResolution(muslVariantIntegrity),
},
{
targets: [{ os: 'darwin', cpu: 'arm64' }],
resolution: variantResolution(darwinVariantIntegrity),
},
],
},
}
function graphFor (selector: Parameters<typeof lockfileToDepGraph>[1]) {
return lockfileToDepGraph(
{
lockfileVersion: '9.0',
importers: {},
packages: {
['node@runtime:22.0.0' as DepPath]: pkgWithVariants,
},
},
selector
)
}
test('picks the linux glibc variant when supportedArchitectures matches it', () => {
expect(graphFor(linuxGlibcSelector)['node@runtime:22.0.0' as DepPath].fullPkgId)
.toBe(`node@runtime:22.0.0:${glibcVariantIntegrity}`)
})
test('picks the linux musl variant when supportedArchitectures.libc=musl', () => {
expect(graphFor(linuxMuslSelector)['node@runtime:22.0.0' as DepPath].fullPkgId)
.toBe(`node@runtime:22.0.0:${muslVariantIntegrity}`)
})
test('picks the darwin variant when supportedArchitectures.os=darwin', () => {
expect(graphFor(darwinSelector)['node@runtime:22.0.0' as DepPath].fullPkgId)
.toBe(`node@runtime:22.0.0:${darwinVariantIntegrity}`)
})
test('different variants produce different fullPkgIds for the same runtime version', () => {
const glibc = graphFor(linuxGlibcSelector)['node@runtime:22.0.0' as DepPath].fullPkgId
const musl = graphFor(linuxMuslSelector)['node@runtime:22.0.0' as DepPath].fullPkgId
const darwin = graphFor(darwinSelector)['node@runtime:22.0.0' as DepPath].fullPkgId
expect(new Set([glibc, musl, darwin]).size).toBe(3)
})
})

View File

@@ -24,6 +24,9 @@
{
"path": "../../lockfile/utils"
},
{
"path": "../../resolving/resolver-base"
},
{
"path": "../path"
}

View File

@@ -1427,6 +1427,7 @@ const _installInContext: InstallFunction = async (projects, ctx, opts) => {
wantedToBeSkippedPackageIds,
hoistWorkspacePackages: opts.hoistWorkspacePackages,
virtualStoreOnly: opts.virtualStoreOnly,
supportedArchitectures: opts.supportedArchitectures,
}
)
stats = result.stats

View File

@@ -30,6 +30,7 @@ import type {
HoistedDependencies,
ProjectId,
Registries,
SupportedArchitectures,
} from '@pnpm/types'
import { symlinkAllModules } from '@pnpm/worker'
import pLimit from 'p-limit'
@@ -74,6 +75,7 @@ export interface LinkPackagesOptions {
wantedToBeSkippedPackageIds: Set<string>
hoistWorkspacePackages?: boolean
virtualStoreOnly: boolean
supportedArchitectures?: SupportedArchitectures
}
export interface LinkPackagesResult {
@@ -160,6 +162,7 @@ export async function linkPackages (projects: ImporterToUpdate[], depGraph: Depe
symlink: opts.symlink,
skipped: opts.skipped,
storeController: opts.storeController,
supportedArchitectures: opts.supportedArchitectures,
virtualStoreDir: opts.virtualStoreDir,
}
)
@@ -331,6 +334,7 @@ interface LinkNewPackagesOptions {
symlink: boolean
skipped: Set<DepPath>
storeController: StoreController
supportedArchitectures?: SupportedArchitectures
virtualStoreDir: string
}
@@ -408,6 +412,7 @@ async function linkNewPackages (
ignoreScripts: opts.ignoreScripts,
lockfileDir: opts.lockfileDir,
sideEffectsCacheRead: opts.sideEffectsCacheRead,
supportedArchitectures: opts.supportedArchitectures,
}),
])
@@ -463,6 +468,7 @@ async function linkAllPkgs (
ignoreScripts: boolean
lockfileDir: string
sideEffectsCacheRead: boolean
supportedArchitectures?: SupportedArchitectures
}
): Promise<void> {
await Promise.all(
@@ -476,6 +482,7 @@ async function linkAllPkgs (
sideEffectsCacheKey = calcDepState(opts.depGraph, opts.depsStateCache, depNode.depPath, {
includeDepGraphHash: !opts.ignoreScripts && depNode.requiresBuild, // true when is built
patchFileHash: depNode.patch?.hash,
supportedArchitectures: opts.supportedArchitectures,
})
}
}

View File

@@ -28,6 +28,7 @@ import {
type ProjectId,
type ProjectManifest,
type ProjectRootDir,
type SupportedArchitectures,
} from '@pnpm/types'
import { isSubdir } from 'is-subdir'
import { difference, zipWith } from 'ramda'
@@ -487,12 +488,13 @@ function extendGraph (
allowBuild?: AllowBuild
globalVirtualStoreDir: string
enableGlobalVirtualStore?: boolean
supportedArchitectures?: SupportedArchitectures
}
): DependenciesGraph {
const pkgMetaIter = iterateGraphPkgMetaEntries(graph, !opts.enableGlobalVirtualStore)
// Only use allowBuild for engine-agnostic hash optimization when GVS is on
const allowBuild = opts.enableGlobalVirtualStore ? opts.allowBuild : undefined
for (const { pkgMeta: { depPath }, hash } of iterateHashedGraphNodes(graph, pkgMetaIter, allowBuild)) {
for (const { pkgMeta: { depPath }, hash } of iterateHashedGraphNodes(graph, pkgMetaIter, allowBuild, opts.supportedArchitectures)) {
const modules = path.join(opts.globalVirtualStoreDir, hash, 'node_modules')
const node = graph[depPath]
Object.assign(node, {

View File

@@ -413,6 +413,7 @@ export async function headlessInstall (opts: HeadlessOptions): Promise<Installat
lockfileDir: opts.lockfileDir,
preferSymlinkedExecutables: opts.preferSymlinkedExecutables,
sideEffectsCacheRead: opts.sideEffectsCacheRead,
supportedArchitectures: opts.supportedArchitectures,
})
stageLogger.debug({
prefix: lockfileDir,
@@ -451,6 +452,7 @@ export async function headlessInstall (opts: HeadlessOptions): Promise<Installat
lockfileDir: opts.lockfileDir,
sideEffectsCacheRead: opts.sideEffectsCacheRead,
storeDir: opts.storeDir,
supportedArchitectures: opts.supportedArchitectures,
}),
])
}
@@ -889,6 +891,7 @@ async function linkAllPkgs (
lockfileDir: string
sideEffectsCacheRead: boolean
storeDir: string
supportedArchitectures?: SupportedArchitectures
}
): Promise<void> {
// Create a marker source file that will be added to filesMap for GVS packages
@@ -917,6 +920,7 @@ async function linkAllPkgs (
sideEffectsCacheKey = calcDepState(opts.depGraph, opts.depsStateCache, depNode.dir, {
includeDepGraphHash: !opts.ignoreScripts && depNode.requiresBuild, // true when is built
patchFileHash: depNode.patch?.hash,
supportedArchitectures: opts.supportedArchitectures,
})
}
}

View File

@@ -16,7 +16,7 @@ import type {
PackageFilesResponse,
StoreController,
} from '@pnpm/store.controller-types'
import type { AllowBuild } from '@pnpm/types'
import type { AllowBuild, SupportedArchitectures } from '@pnpm/types'
import { rimraf } from '@zkochan/rimraf'
import pLimit from 'p-limit'
import { difference, isEmpty } from 'ramda'
@@ -37,6 +37,7 @@ export async function linkHoistedModules (
lockfileDir: string
preferSymlinkedExecutables?: boolean
sideEffectsCacheRead: boolean
supportedArchitectures?: SupportedArchitectures
}
): Promise<void> {
// TODO: remove nested node modules first
@@ -97,10 +98,10 @@ async function linkAllPkgsInOrder (
lockfileDir: string
preferSymlinkedExecutables?: boolean
sideEffectsCacheRead: boolean
supportedArchitectures?: SupportedArchitectures
warn: (message: string) => void
}
): Promise<void> {
const _calcDepState = calcDepState.bind(null, graph, opts.depsStateCache)
await Promise.all(
Object.entries(hierarchy).map(async ([dir, deps]) => {
const depNode = graph[dir]
@@ -117,9 +118,10 @@ async function linkAllPkgsInOrder (
let sideEffectsCacheKey: string | undefined
if (opts.sideEffectsCacheRead && filesResponse.sideEffectsMaps && !isEmpty(filesResponse.sideEffectsMaps)) {
if (opts.allowBuild?.(depNode.name, depNode.version) === true) {
sideEffectsCacheKey = _calcDepState(dir, {
sideEffectsCacheKey = calcDepState(graph, opts.depsStateCache, dir, {
includeDepGraphHash: !opts.ignoreScripts && depNode.requiresBuild, // true when is built
patchFileHash: depNode.patch?.hash,
supportedArchitectures: opts.supportedArchitectures,
})
}
}

View File

@@ -15,15 +15,17 @@ import { pickFetcher } from '@pnpm/fetching.pick-fetcher'
import gfs from '@pnpm/fs.graceful-fs'
import type { CustomFetcher } from '@pnpm/hooks.types'
import { logger } from '@pnpm/logger'
import type {
AtomicResolution,
DirectoryResolution,
PlatformAssetResolution,
PreferredVersions,
Resolution,
ResolveFunction,
ResolveResult,
TarballResolution,
import {
type AtomicResolution,
type DirectoryResolution,
type PlatformAssetResolution,
type PreferredVersions,
type Resolution,
type ResolveFunction,
resolvePlatformSelector,
type ResolveResult,
selectPlatformVariant,
type TarballResolution,
} from '@pnpm/resolving.resolver-base'
import {
normalizeBundledManifest,
@@ -368,28 +370,17 @@ function getFilesIndexFilePath (
}
function findResolution (resolutionVariants: PlatformAssetResolution[], supportedArchitectures?: SupportedArchitectures): AtomicResolution {
const platform = getOneIfNonCurrent(supportedArchitectures?.os) ?? process.platform
const cpu = getOneIfNonCurrent(supportedArchitectures?.cpu) ?? process.arch
const libc = getOneIfNonCurrent(supportedArchitectures?.libc) ?? getLibcFamilySync()
const resolutionVariant = resolutionVariants
.find((resolutionVariant) => resolutionVariant.targets.some(
(target) =>
target.os === platform &&
target.cpu === cpu &&
(target.libc == null || target.libc === libc)
))
if (!resolutionVariant) {
const selector = resolvePlatformSelector(supportedArchitectures, {
platform: process.platform,
arch: process.arch,
libc: getLibcFamilySync(),
})
const variant = selectPlatformVariant(resolutionVariants, selector)
if (!variant) {
const resolutionTargets = resolutionVariants.map((variant) => variant.targets)
throw new PnpmError('NO_RESOLUTION_MATCHED', `Cannot find a resolution variant for the current platform in these resolutions: ${JSON.stringify(resolutionTargets)}`)
}
return resolutionVariant.resolution
}
function getOneIfNonCurrent (requirements: string[] | undefined): string | undefined {
if (requirements?.length && requirements[0] !== 'current') {
return requirements[0]
}
return undefined
return variant.resolution
}
function fetchToStore (

pnpm-lock.yaml generated

@@ -3185,9 +3185,15 @@ importers:
'@pnpm/lockfile.utils':
specifier: workspace:*
version: link:../../lockfile/utils
'@pnpm/resolving.resolver-base':
specifier: workspace:*
version: link:../../resolving/resolver-base
'@pnpm/types':
specifier: workspace:*
version: link:../../core/types
detect-libc:
specifier: 'catalog:'
version: 2.1.2
devDependencies:
'@pnpm/deps.graph-hasher':
specifier: workspace:*
@@ -7772,12 +7778,18 @@ importers:
'@pnpm/engine.runtime.commands':
specifier: workspace:*
version: link:../../engine/runtime/commands
'@pnpm/engine.runtime.node-resolver':
specifier: workspace:*
version: link:../../engine/runtime/node-resolver
'@pnpm/error':
specifier: workspace:*
version: link:../../core/error
'@pnpm/exec.lifecycle':
specifier: workspace:*
version: link:../../exec/lifecycle
'@pnpm/exec.pnpm-cli-runner':
specifier: workspace:*
version: link:../../exec/pnpm-cli-runner
'@pnpm/fetching.directory-fetcher':
specifier: workspace:*
version: link:../../fetching/directory-fetcher
@@ -7841,6 +7853,9 @@ importers:
ci-info:
specifier: 'catalog:'
version: 4.4.0
detect-libc:
specifier: 'catalog:'
version: 2.1.2
enquirer:
specifier: 'catalog:'
version: 2.4.1


@@ -19,7 +19,7 @@ import {
import { add, dedupe, fetch, importCommand, install, link, prune, remove, unlink, update } from '@pnpm/installing.commands'
import { patch, patchCommit, patchRemove } from '@pnpm/patching.commands'
import { deprecate, distTag, ping, search, star, stars, undeprecate, unpublish, unstar, whoami } from '@pnpm/registry-access.commands'
import { deploy, pack, publish, version } from '@pnpm/releasing.commands'
import { deploy, pack, packApp, publish, version } from '@pnpm/releasing.commands'
import { catFile, catIndex, findHash, store } from '@pnpm/store.commands'
import { init } from '@pnpm/workspace.commands'
import { pick } from 'ramda'
@@ -157,6 +157,7 @@ const commands: CommandDefinition[] = [
licenses,
outdated,
pack,
packApp,
patch,
patchCommit,
patchRemove,


@@ -41,8 +41,10 @@
"@pnpm/constants": "workspace:*",
"@pnpm/deps.path": "workspace:*",
"@pnpm/engine.runtime.commands": "workspace:*",
"@pnpm/engine.runtime.node-resolver": "workspace:*",
"@pnpm/error": "workspace:*",
"@pnpm/exec.lifecycle": "workspace:*",
"@pnpm/exec.pnpm-cli-runner": "workspace:*",
"@pnpm/fetching.directory-fetcher": "workspace:*",
"@pnpm/fs.indexed-pkg-importer": "workspace:*",
"@pnpm/fs.is-empty-dir-or-nothing": "workspace:*",
@@ -64,6 +66,7 @@
"@zkochan/rimraf": "catalog:",
"chalk": "catalog:",
"ci-info": "catalog:",
"detect-libc": "catalog:",
"enquirer": "catalog:",
"execa": "catalog:",
"libnpmpublish": "catalog:",


@@ -1,3 +1,4 @@
export { deploy } from './deploy/index.js'
export { packApp } from './pack-app/index.js'
export { pack, publish } from './publish/index.js'
export { version } from './version/index.js'


@@ -0,0 +1,3 @@
import * as packApp from './packApp.js'
export { packApp }


@@ -0,0 +1,525 @@
import fs from 'node:fs'
import { mkdir, mkdtemp, readFile, rm, writeFile } from 'node:fs/promises'
import os from 'node:os'
import path from 'node:path'
import { docsUrl } from '@pnpm/cli.utils'
import type { Config } from '@pnpm/config.reader'
import {
getNodeMirror,
parseNodeSpecifier,
resolveNodeVersion,
} from '@pnpm/engine.runtime.node-resolver'
import { PnpmError } from '@pnpm/error'
import { runPnpmCli } from '@pnpm/exec.pnpm-cli-runner'
import { createFetchFromRegistry } from '@pnpm/network.fetch'
import { familySync } from 'detect-libc'
import { safeExeca as execa } from 'execa'
import { renderHelp } from 'render-help'
/** Minimum Node.js version that supports `node --build-sea`. */
const MIN_BUILDER_VERSION = { major: 25, minor: 5 } as const
// Range to download when the running Node is too old. Constrained to the
// current major so we don't silently jump majors across releases, and pinned
// above MIN_BUILDER_VERSION.minor so older point releases (e.g. 25.0.x) that
// don't support `--build-sea` aren't picked.
const DEFAULT_BUILDER_SPEC = `>=${MIN_BUILDER_VERSION.major}.${MIN_BUILDER_VERSION.minor}.0 <${MIN_BUILDER_VERSION.major + 1}.0.0`
// Target OS names match `process.platform`. That keeps the CLI surface
// consistent with pnpm's own `--os` flag (which also takes platform constants)
// and with `supportedArchitectures.os` in pnpm-workspace.yaml.
const SUPPORTED_OS = ['linux', 'darwin', 'win32'] as const
const SUPPORTED_TARGETS =
'linux-x64, linux-x64-musl, linux-arm64, linux-arm64-musl, darwin-x64, darwin-arm64, win32-x64, win32-arm64'
export const commandNames = ['pack-app']
export function rcOptionsTypes (): Record<string, unknown> {
return {}
}
export function cliOptionsTypes (): Record<string, unknown> {
return {
entry: String,
target: [String, Array],
'node-version': String,
'output-dir': String,
'output-name': String,
}
}
export const shorthands: Record<string, string> = {
t: '--target',
o: '--output-dir',
}
export function help (): string {
return renderHelp({
description:
'Pack a CommonJS entry file into a standalone executable for one or more target platforms.\n\n' +
'The executable embeds a Node.js binary via the Node.js Single Executable Applications API.\n' +
`Requires Node.js v${MIN_BUILDER_VERSION.major}.${MIN_BUILDER_VERSION.minor}+ to perform ` +
'the injection. The running Node.js is used when it is new enough; otherwise, the ' +
`latest Node.js v${MIN_BUILDER_VERSION.major}.${MIN_BUILDER_VERSION.minor}+ in the ` +
`v${MIN_BUILDER_VERSION.major}.x line is downloaded automatically.\n\n` +
'Defaults for --entry, --target, --node-version, --output-dir, and --output-name can be ' +
'set in the package.json under "pnpm.app". CLI flags override the config; --target entirely ' +
'replaces the configured list so you can narrow it at invocation time.',
url: docsUrl('pack-app'),
usages: [
'pnpm pack-app --entry dist/index.cjs --target linux-x64 --target win32-x64',
'pnpm pack-app --entry dist/index.cjs --target linux-x64-musl --node-version 22',
],
descriptionLists: [
{
title: 'Options',
list: [
{
description: 'Path to the CJS entry file to embed in the executable',
name: '--entry',
},
{
description:
`Target to build for. May be specified multiple times. Supported: ${SUPPORTED_TARGETS}`,
name: '--target',
shortAlias: '-t',
},
{
description:
'Node.js version to embed in the output executables (e.g. "22", "22.0.0", "lts"). ' +
'Defaults to the running Node.js version.',
name: '--node-version',
},
{
description: 'Output directory for the built executables. Defaults to "dist-app".',
name: '--output-dir',
shortAlias: '-o',
},
{
description:
'Name for the output executable (without extension). Defaults to the unscoped package name.',
name: '--output-name',
},
],
},
],
})
}
export type PackAppOptions = Pick<Config,
| 'dir'
| 'pnpmHomeDir'
> & Partial<Pick<Config,
| 'ca'
| 'cert'
| 'configByUri'
| 'httpProxy'
| 'httpsProxy'
| 'key'
| 'localAddress'
| 'nodeDownloadMirrors'
| 'noProxy'
| 'strictSsl'
| 'userAgent'
>> & {
entry?: string
target?: string | string[]
nodeVersion?: string
outputDir?: string
outputName?: string
}
interface ParsedTarget {
raw: string
platform: string
arch: string
libc?: string
}
export async function handler (opts: PackAppOptions, params: string[]): Promise<string> {
// pnpm.app in package.json supplies defaults for every flag. CLI flags win,
// but `--target` entirely replaces the config list (additive merging would
// prevent narrowing from the CLI). See ProjectAppConfig below for the shape.
const project = await readProjectAppConfig(opts.dir)
const entryPath = opts.entry ?? params[0] ?? project.app?.entry
if (!entryPath) {
throw new PnpmError('PACK_APP_MISSING_ENTRY',
'"pnpm pack-app" requires a CJS entry file — pass --entry <path> or set "pnpm.app.entry" in package.json.')
}
const resolvedEntry = path.resolve(opts.dir, entryPath)
let entryStat: fs.Stats
try {
entryStat = fs.statSync(resolvedEntry)
} catch {
throw new PnpmError('PACK_APP_ENTRY_NOT_FOUND', `Entry file not found: ${resolvedEntry}`)
}
if (!entryStat.isFile()) {
throw new PnpmError('PACK_APP_ENTRY_NOT_FILE',
`Entry path must be a regular file: ${resolvedEntry}`)
}
const cliTargets = opts.target == null
? undefined
: Array.isArray(opts.target) ? opts.target : [opts.target]
const rawTargets = cliTargets ?? project.app?.targets ?? []
if (rawTargets.length === 0) {
throw new PnpmError('PACK_APP_MISSING_TARGET',
`"pnpm pack-app" requires at least one target — pass --target <triplet> or set "pnpm.app.targets" in package.json. Supported: ${SUPPORTED_TARGETS}`)
}
const targets = rawTargets.map(parseTarget)
const outputDir = path.resolve(opts.dir, opts.outputDir ?? project.app?.outputDir ?? 'dist-app')
await mkdir(outputDir, { recursive: true })
const outputName = validateOutputName(opts.outputName ?? project.app?.outputName ?? deriveOutputNameFromPackage(project, opts.dir))
const requestedNodeSpec = opts.nodeVersion ?? project.app?.nodeVersion ?? process.version.slice(1)
const fetch = createFetchFromRegistry(opts)
const buildRoot = path.join(opts.pnpmHomeDir, 'pack-app')
const builderBin = await resolveBuilderBinary({ fetch, nodeDownloadMirrors: opts.nodeDownloadMirrors, buildRoot })
const resolvedTargetVersion = await resolveVersion(fetch, requestedNodeSpec, opts.nodeDownloadMirrors)
const results: string[] = []
for (const target of targets) {
// eslint-disable-next-line no-await-in-loop
const embeddedNodeBin = await ensureNodeRuntime({
buildRoot,
version: resolvedTargetVersion,
platform: target.platform,
arch: target.arch,
libc: target.libc,
})
const targetOutputDir = path.join(outputDir, target.raw)
// eslint-disable-next-line no-await-in-loop
await mkdir(targetOutputDir, { recursive: true })
const outputFile = target.platform === 'win32'
? path.join(targetOutputDir, `${outputName}.exe`)
: path.join(targetOutputDir, outputName)
const seaConfig = {
main: resolvedEntry,
output: outputFile,
executable: embeddedNodeBin,
disableExperimentalSEAWarning: true,
useCodeCache: false,
useSnapshot: false,
}
// Write the SEA config into a fresh, unpredictable temp directory (0700
// by default) rather than a predictable path under os.tmpdir(). Avoids
// TOCTOU/symlink attacks on multi-user systems.
// eslint-disable-next-line no-await-in-loop
const tmpConfigDir = await mkdtemp(path.join(os.tmpdir(), 'pnpm-pack-app-'))
const configPath = path.join(tmpConfigDir, 'sea-config.json')
// eslint-disable-next-line no-await-in-loop
await writeFile(configPath, JSON.stringify(seaConfig, null, 2), { flag: 'wx' })
try {
// eslint-disable-next-line no-await-in-loop
await execa(builderBin, ['--build-sea', configPath], { stdio: 'inherit' })
} finally {
// eslint-disable-next-line no-await-in-loop
await rm(tmpConfigDir, { recursive: true, force: true }).catch(() => {})
}
// eslint-disable-next-line no-await-in-loop
await adHocSignMacBinary(target, outputFile)
results.push(` ${target.raw}: ${outputFile} (Node.js ${resolvedTargetVersion})`)
}
return `Built ${targets.length} executable${targets.length === 1 ? '' : 's'}:\n${results.join('\n')}`
}
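For reference, the `seaConfig` object written above serializes to a `sea-config.json` along these lines (all paths illustrative, not taken from a real run):

```json
{
  "main": "/home/me/app/dist/index.cjs",
  "output": "/home/me/app/dist-app/linux-x64/my-tool",
  "executable": "/home/me/.pnpm-home/pack-app/linux-x64-glibc-25.5.0/node_modules/node/bin/node",
  "disableExperimentalSEAWarning": true,
  "useCodeCache": false,
  "useSnapshot": false
}
```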
/**
* Returns a Node.js binary that supports `--build-sea`. Prefers the running
* interpreter to avoid a download; falls back to downloading the latest Node.js v25.5+.
*/
async function resolveBuilderBinary (ctx: {
fetch: ReturnType<typeof createFetchFromRegistry>
nodeDownloadMirrors?: Record<string, string>
buildRoot: string
}): Promise<string> {
if (runningNodeCanBuildSea()) {
return process.execPath
}
const version = await resolveVersion(ctx.fetch, DEFAULT_BUILDER_SPEC, ctx.nodeDownloadMirrors)
return ensureNodeRuntime({
buildRoot: ctx.buildRoot,
version,
platform: process.platform,
arch: process.arch,
// Pin libc to the host's. Otherwise a caller that had set
// supportedArchitectures.libc=musl in their config would cause the
// glibc host to download a musl Node that it cannot execute.
libc: hostLinuxLibc(),
})
}
function hostLinuxLibc (): 'glibc' | 'musl' | undefined {
if (process.platform !== 'linux') return undefined
const family = familySync()
return family === 'musl' ? 'musl' : 'glibc'
}
function runningNodeCanBuildSea (): boolean {
const [majorStr, minorStr] = process.version.slice(1).split('.')
const major = Number(majorStr)
const minor = Number(minorStr)
return (
major > MIN_BUILDER_VERSION.major ||
(major === MIN_BUILDER_VERSION.major && minor >= MIN_BUILDER_VERSION.minor)
)
}
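The version-floor check above can be sketched in parameterized form (the command itself reads `process.version`; `MIN` mirrors `MIN_BUILDER_VERSION`):

```typescript
// Parameterized sketch of runningNodeCanBuildSea: accepts any "vX.Y.Z" string.
const MIN = { major: 25, minor: 5 }
function canBuildSea (version: string): boolean {
  const [major, minor] = version.replace(/^v/, '').split('.').map(Number)
  return major > MIN.major || (major === MIN.major && minor >= MIN.minor)
}
console.log(canBuildSea('v25.5.0')) // true: exactly at the floor
console.log(canBuildSea('v25.4.9')) // false: same major, below the minor floor
console.log(canBuildSea('v26.0.0')) // true: a newer major always qualifies
```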
/**
* Fetches a Node.js runtime into a dedicated per-target directory under the
* pnpm home, reusing the cached binary if already present. Actual files are
* hardlinked from pnpm's content-addressable store, so repeated calls are
* cheap and `pnpm store prune` can reclaim them.
*/
async function ensureNodeRuntime (opts: {
buildRoot: string
version: string
platform: string
arch: string
libc?: string
}): Promise<string> {
// Linux variants always need a libc pin (glibc or musl) so that variant
// selection is deterministic and doesn't depend on the host's detected
// libc or the user's supportedArchitectures.libc config.
const libc = opts.platform === 'linux' ? opts.libc ?? 'glibc' : opts.libc
const targetId = [opts.platform, opts.arch, libc].filter(Boolean).join('-')
const installDir = path.join(opts.buildRoot, `${targetId}-${opts.version}`)
const nodeDir = path.join(installDir, 'node_modules', 'node')
const binaryPath = nodeBinaryPath(nodeDir, opts.platform)
if (fs.existsSync(binaryPath)) return binaryPath
await mkdir(installDir, { recursive: true })
await writeFile(
path.join(installDir, 'package.json'),
`${JSON.stringify({ name: `pnpm-pack-app-${targetId}`, private: true }, null, 2)}\n`
)
// Flags that select the target variant must come before the positional
// package spec; otherwise `pnpm add` silently installs the host variant.
const args = [
'add',
'--ignore-scripts',
'--ignore-workspace',
`--os=${opts.platform}`,
`--cpu=${opts.arch}`,
]
if (libc != null) {
args.push(`--libc=${libc}`)
}
args.push(`node@runtime:${opts.version}`)
runPnpmCli(args, { cwd: installDir })
if (!fs.existsSync(binaryPath)) {
throw new PnpmError('PACK_APP_NODE_BINARY_MISSING',
`Expected Node.js binary at ${binaryPath} after installing node@runtime:${opts.version}, but it was not found.`)
}
return binaryPath
}
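The argv ordering rule from the comment above (variant-selecting flags before the positional spec) can be isolated into a small sketch; the version string here is illustrative:

```typescript
// Sketch of the argv construction in ensureNodeRuntime: --os/--cpu/--libc
// must precede the positional node@runtime:<version> spec.
function buildAddArgs (platform: string, arch: string, libc?: string, version = '25.5.0'): string[] {
  const args = ['add', '--ignore-scripts', '--ignore-workspace', `--os=${platform}`, `--cpu=${arch}`]
  if (libc != null) args.push(`--libc=${libc}`)
  args.push(`node@runtime:${version}`) // positional spec goes last
  return args
}
console.log(buildAddArgs('linux', 'arm64', 'musl').join(' '))
// add --ignore-scripts --ignore-workspace --os=linux --cpu=arm64 --libc=musl node@runtime:25.5.0
```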
function nodeBinaryPath (nodeDir: string, platform: string): string {
return platform === 'win32'
? path.join(nodeDir, 'node.exe')
: path.join(nodeDir, 'bin', 'node')
}
async function resolveVersion (
fetch: ReturnType<typeof createFetchFromRegistry>,
specifier: string,
nodeDownloadMirrors?: Record<string, string>
): Promise<string> {
const { releaseChannel, versionSpecifier } = parseNodeSpecifier(specifier)
const nodeMirrorBaseUrl = getNodeMirror(nodeDownloadMirrors, releaseChannel)
const version = await resolveNodeVersion(fetch, versionSpecifier, nodeMirrorBaseUrl)
if (!version) {
throw new PnpmError('PACK_APP_NODE_VERSION_NOT_FOUND',
`Could not find a Node.js version that satisfies "${specifier}"`)
}
return version
}
// Parsed triplet must match this shape exactly. We anchor and constrain each
// segment so that inputs like `linux-x64-musl-../../outside` are rejected
// outright — otherwise `target.raw` would later flow into path.join for the
// output directory and could escape it.
const TARGET_PATTERN = /^(linux|darwin|win32)-(x64|arm64)(?:-(musl))?$/
function parseTarget (raw: string): ParsedTarget {
const match = TARGET_PATTERN.exec(raw)
if (!match) {
throw new PnpmError('PACK_APP_INVALID_TARGET',
`Invalid target: "${raw}". Expected format: <os>-<arch>[-<libc>] where <os> is ${SUPPORTED_OS.join('|')}, <arch> is x64|arm64, optional <libc> is musl (linux only).`)
}
const [, platform, arch, libc] = match
if (libc === 'musl' && platform !== 'linux') {
throw new PnpmError('PACK_APP_INVALID_TARGET',
`The "musl" libc suffix is only valid for linux targets (got "${raw}").`)
}
return { raw, platform, arch, libc: libc || undefined }
}
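Exercising the anchored pattern against a few shapes shows how it rejects both malformed and malicious input (the full `parseTarget` additionally rejects musl on non-linux):

```typescript
// The same anchored triplet pattern as above, tried standalone.
const TARGET_PATTERN = /^(linux|darwin|win32)-(x64|arm64)(?:-(musl))?$/
console.log(TARGET_PATTERN.test('linux-arm64-musl'))          // true
console.log(TARGET_PATTERN.test('linux-x64-musl-../../pwn'))  // false: $ anchor rejects the traversal suffix
console.log(TARGET_PATTERN.test('macos-arm64'))               // false: the OS segment is darwin, not macos
console.log(TARGET_PATTERN.test('LINUX-x64'))                 // false: matching is case-sensitive
```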
// Characters that Win32 rejects in filenames, plus NUL. Path separators are
// checked separately via `path.basename` so the message is crisp.
const INVALID_FILENAME_CHARS = /[<>:"|?*\0]/
// Win32 reserved device names (case-insensitive, with or without an extension).
const RESERVED_WINDOWS_NAME = /^(?:con|prn|aux|nul|com[1-9]|lpt[1-9])(?:\..*)?$/i
// Reject anything that would let the output escape its target directory, or
// that would fail filesystem-level validation on any supported host. This
// surfaces problems at `pack-app` invocation time instead of letting them
// blow up later in `writeFile(outputFile, …)`.
function validateOutputName (name: string): string {
if (
name !== path.basename(name) ||
name === '' || name === '.' || name === '..' ||
name.includes('/') || name.includes('\\') ||
INVALID_FILENAME_CHARS.test(name) ||
RESERVED_WINDOWS_NAME.test(name) ||
/[. ]$/.test(name)
) {
throw new PnpmError('PACK_APP_INVALID_OUTPUT_NAME',
`Invalid --output-name "${name}". The name must be a plain filename without path separators, Windows-reserved names (e.g. CON, NUL), characters like <>:"|?* or NUL, and must not end in a dot or space.`)
}
return name
}
/** Fields pack-app reads from `pnpm.app` in package.json. */
export interface ProjectAppConfig {
entry?: string
targets?: string[]
nodeVersion?: string
outputDir?: string
outputName?: string
}
interface ReadProjectAppConfigResult {
name?: string
app?: ProjectAppConfig
}
// A narrow reader just for this command. Using readProjectManifest from
// @pnpm/cli.utils would pull in the installable/engine checks, which are
// irrelevant here: pack-app doesn't need the current project to be installable
// under the running Node, just to have a package.json with optional settings.
async function readProjectAppConfig (dir: string): Promise<ReadProjectAppConfigResult> {
let raw: string
try {
raw = await readFile(path.join(dir, 'package.json'), 'utf8')
} catch {
return {}
}
let manifest: unknown
try {
manifest = JSON.parse(raw)
} catch (err) {
throw new PnpmError('PACK_APP_INVALID_PACKAGE_JSON',
`Failed to parse ${path.join(dir, 'package.json')}: ${(err as Error).message}`)
}
if (!isObject(manifest)) return {}
const name = typeof manifest.name === 'string' && manifest.name !== '' ? manifest.name : undefined
const pnpmField = isObject(manifest.pnpm) ? manifest.pnpm : undefined
const appField = pnpmField && isObject(pnpmField.app) ? pnpmField.app : undefined
if (!appField) return { name }
return { name, app: validateAppConfig(appField) }
}
function validateAppConfig (raw: Record<string, unknown>): ProjectAppConfig {
const known = new Set(['entry', 'targets', 'nodeVersion', 'outputDir', 'outputName'])
for (const key of Object.keys(raw)) {
if (!known.has(key)) {
throw new PnpmError('PACK_APP_INVALID_CONFIG',
`Unknown "pnpm.app.${key}" setting in package.json. Allowed keys: ${Array.from(known).join(', ')}.`)
}
}
const config: ProjectAppConfig = {}
if (raw.entry != null) {
if (typeof raw.entry !== 'string') {
throw new PnpmError('PACK_APP_INVALID_CONFIG', '"pnpm.app.entry" must be a string.')
}
config.entry = raw.entry
}
if (raw.targets != null) {
if (!Array.isArray(raw.targets) || !raw.targets.every((t): t is string => typeof t === 'string')) {
throw new PnpmError('PACK_APP_INVALID_CONFIG', '"pnpm.app.targets" must be an array of strings.')
}
config.targets = raw.targets
}
if (raw.nodeVersion != null) {
if (typeof raw.nodeVersion !== 'string') {
throw new PnpmError('PACK_APP_INVALID_CONFIG', '"pnpm.app.nodeVersion" must be a string.')
}
config.nodeVersion = raw.nodeVersion
}
if (raw.outputDir != null) {
if (typeof raw.outputDir !== 'string') {
throw new PnpmError('PACK_APP_INVALID_CONFIG', '"pnpm.app.outputDir" must be a string.')
}
config.outputDir = raw.outputDir
}
if (raw.outputName != null) {
if (typeof raw.outputName !== 'string') {
throw new PnpmError('PACK_APP_INVALID_CONFIG', '"pnpm.app.outputName" must be a string.')
}
config.outputName = raw.outputName
}
return config
}
function deriveOutputNameFromPackage (project: ReadProjectAppConfigResult, dir: string): string {
if (!project.name) {
throw new PnpmError('PACK_APP_NO_OUTPUT_NAME',
`Could not determine the output name: package.json in ${dir} has no "name" field.`,
{ hint: 'Pass --output-name <name> or set "pnpm.app.outputName" in package.json.' }
)
}
// Strip @scope/ prefix from scoped packages so the binary name is a plain
// filename instead of "scope/name". The second validateOutputName() pass
// downstream rejects any leftover path separators.
return project.name.replace(/^@[^/]+\//, '')
}
function isObject (value: unknown): value is Record<string, unknown> {
return value != null && typeof value === 'object' && !Array.isArray(value)
}
/**
* SEA injection invalidates the existing code signature on macOS binaries, so
* the output must be re-signed. Native macOS hosts use `codesign`; Linux hosts
* cross-signing a darwin target use `ldid`. Windows hosts have no readily
* available ad-hoc signer, so we refuse to produce an unsigned output silently
* and tell the user to re-sign on macOS or Linux.
*/
async function adHocSignMacBinary (target: ParsedTarget, outputFile: string): Promise<void> {
if (target.platform !== 'darwin') return
if (process.platform === 'darwin') {
await execa('codesign', ['--sign', '-', outputFile], { stdio: 'inherit' })
return
}
if (process.platform === 'linux') {
try {
await execa('ldid', ['-S', outputFile], { stdio: 'inherit' })
} catch {
throw new PnpmError('PACK_APP_MACOS_SIGN_FAILED',
`Cross-compiled macOS binary at ${outputFile} could not be ad-hoc signed with "ldid".`,
{ hint: 'Install ldid (https://github.com/ProcursusTeam/ldid) or re-sign the binary on macOS with "codesign --sign - <file>".' }
)
}
return
}
throw new PnpmError('PACK_APP_MACOS_SIGN_UNSUPPORTED_HOST',
`Cannot ad-hoc sign the macOS binary at ${outputFile} on a ${process.platform} host.`,
{ hint: 'Build macOS targets on a macOS or Linux host, or re-sign the produced binary yourself with "codesign --sign -" on macOS.' }
)
}
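A package.json using every `pnpm.app` setting this command reads might look like the following (all values illustrative):

```json
{
  "name": "@acme/my-tool",
  "pnpm": {
    "app": {
      "entry": "dist/index.cjs",
      "targets": ["linux-x64", "linux-x64-musl", "win32-x64"],
      "nodeVersion": "25",
      "outputDir": "dist-app",
      "outputName": "my-tool"
    }
  }
}
```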


@@ -0,0 +1,223 @@
import fs from 'node:fs'
import os from 'node:os'
import path from 'node:path'
import { afterEach, beforeEach, describe, expect, it } from '@jest/globals'
import { packApp } from '../../src/index.js'
const { cliOptionsTypes, commandNames, handler, help, shorthands } = packApp
describe('pack-app command', () => {
let tempDir: string
beforeEach(() => {
tempDir = fs.mkdtempSync(path.join(os.tmpdir(), 'pnpm-pack-app-test-'))
})
afterEach(() => {
fs.rmSync(tempDir, { recursive: true, force: true })
})
it('exposes the expected command name and shorthands', () => {
expect(commandNames).toEqual(['pack-app'])
expect(shorthands.t).toBe('--target')
expect(shorthands.o).toBe('--output-dir')
})
it('declares the user-facing CLI option types', () => {
const types = cliOptionsTypes()
expect(types.entry).toBe(String)
expect(types.target).toEqual([String, Array])
expect(types['node-version']).toBe(String)
expect(types['output-dir']).toBe(String)
expect(types['output-name']).toBe(String)
})
it('renders help text that lists the key options and supported targets', () => {
const text = help()
expect(text).toContain('Single Executable Application')
expect(text).toContain('--entry')
expect(text).toContain('--target')
expect(text).toContain('linux-x64')
expect(text).toContain('win32-arm64')
expect(text).toContain('linux-x64-musl')
})
function baseOpts (): Record<string, unknown> {
return {
dir: tempDir,
pnpmHomeDir: path.join(tempDir, 'pnpm-home'),
rawConfig: {},
}
}
it('fails fast when no --entry is provided', async () => {
await expect(
// eslint-disable-next-line @typescript-eslint/no-explicit-any
handler(baseOpts() as any, [])
).rejects.toMatchObject({ code: 'ERR_PNPM_PACK_APP_MISSING_ENTRY' })
})
it('fails fast when the entry file does not exist', async () => {
await expect(
// eslint-disable-next-line @typescript-eslint/no-explicit-any
handler({ ...baseOpts(), entry: 'missing.cjs' } as any, [])
).rejects.toMatchObject({ code: 'ERR_PNPM_PACK_APP_ENTRY_NOT_FOUND' })
})
it('fails fast when the entry path is a directory', async () => {
fs.mkdirSync(path.join(tempDir, 'entry-dir'))
await expect(
// eslint-disable-next-line @typescript-eslint/no-explicit-any
handler({ ...baseOpts(), entry: 'entry-dir' } as any, [])
).rejects.toMatchObject({ code: 'ERR_PNPM_PACK_APP_ENTRY_NOT_FILE' })
})
it('reads entry from pnpm.app.entry when --entry is omitted', async () => {
fs.writeFileSync(path.join(tempDir, 'package.json'), JSON.stringify({
name: 'test-app',
pnpm: { app: { entry: 'from-config.cjs' } },
}))
fs.writeFileSync(path.join(tempDir, 'from-config.cjs'), 'module.exports = {}')
// With entry from config but no target, we hit MISSING_TARGET — that's
// enough to verify the entry was picked up from pnpm.app.entry.
await expect(
// eslint-disable-next-line @typescript-eslint/no-explicit-any
handler(baseOpts() as any, [])
).rejects.toMatchObject({ code: 'ERR_PNPM_PACK_APP_MISSING_TARGET' })
})
it('reads targets from pnpm.app.targets when --target is omitted', async () => {
fs.writeFileSync(path.join(tempDir, 'package.json'), JSON.stringify({
name: 'test-app',
pnpm: { app: { targets: ['bad-target'] } },
}))
fs.writeFileSync(path.join(tempDir, 'entry.cjs'), 'module.exports = {}')
// A bad-target in the config should reach parseTarget and surface
// INVALID_TARGET — proves the config list was consulted.
await expect(
// eslint-disable-next-line @typescript-eslint/no-explicit-any
handler({ ...baseOpts(), entry: 'entry.cjs' } as any, [])
).rejects.toMatchObject({ code: 'ERR_PNPM_PACK_APP_INVALID_TARGET' })
})
it('CLI --target replaces pnpm.app.targets entirely (no merging)', async () => {
// Config says targets = [bad-target]. If the CLI list were merged in, the
// bad config entry would still hit parseTarget and throw INVALID_TARGET.
// With an unresolvable node version, validation passes but the later
// version lookup fails — we only assert that INVALID_TARGET never fires.
fs.writeFileSync(path.join(tempDir, 'package.json'), JSON.stringify({
name: 'test-app',
pnpm: { app: { entry: 'entry.cjs', targets: ['bad-target'] } },
}))
fs.writeFileSync(path.join(tempDir, 'entry.cjs'), 'module.exports = {}')
await expect(
// eslint-disable-next-line @typescript-eslint/no-explicit-any
handler({ ...baseOpts(), target: 'linux-x64', nodeVersion: '0.0.0-nonexistent-xxx' } as any, [])
).rejects.toMatchObject({ code: expect.not.stringMatching(/INVALID_TARGET/) })
})
it('rejects unknown keys in pnpm.app', async () => {
fs.writeFileSync(path.join(tempDir, 'package.json'), JSON.stringify({
name: 'test-app',
pnpm: { app: { entry: 'entry.cjs', bogus: 'yes' } },
}))
fs.writeFileSync(path.join(tempDir, 'entry.cjs'), 'module.exports = {}')
await expect(
// eslint-disable-next-line @typescript-eslint/no-explicit-any
handler(baseOpts() as any, [])
).rejects.toMatchObject({ code: 'ERR_PNPM_PACK_APP_INVALID_CONFIG' })
})
it.each([
['entry as number', { entry: 42 }],
['targets as string', { targets: 'linux-x64' }],
['targets with non-string', { targets: ['linux-x64', 7] }],
['nodeVersion as array', { nodeVersion: ['25'] }],
])('rejects malformed pnpm.app: %s', async (_label, appConfig) => {
fs.writeFileSync(path.join(tempDir, 'package.json'), JSON.stringify({
name: 'test-app',
pnpm: { app: appConfig },
}))
fs.writeFileSync(path.join(tempDir, 'entry.cjs'), 'module.exports = {}')
await expect(
// eslint-disable-next-line @typescript-eslint/no-explicit-any
handler(baseOpts() as any, [])
).rejects.toMatchObject({ code: 'ERR_PNPM_PACK_APP_INVALID_CONFIG' })
})
it('fails fast when no --target is provided', async () => {
fs.writeFileSync(path.join(tempDir, 'entry.cjs'), 'module.exports = {}')
await expect(
// eslint-disable-next-line @typescript-eslint/no-explicit-any
handler({ ...baseOpts(), entry: 'entry.cjs' } as any, [])
).rejects.toMatchObject({ code: 'ERR_PNPM_PACK_APP_MISSING_TARGET' })
})
it.each([
['unknown OS', 'freebsd-x64'],
['unknown arch', 'linux-mips'],
['unknown libc', 'linux-x64-gnu'],
['musl on non-linux', 'darwin-arm64-musl'],
['legacy macos alias', 'macos-arm64'],
['legacy win alias', 'win-x64'],
['incomplete', 'linux'],
['extra segment', 'linux-x64-musl-extra'],
['path traversal injected after musl', 'linux-x64-musl-../../pwn'],
['uppercase', 'LINUX-x64'],
['leading whitespace', ' linux-x64'],
])('rejects invalid target: %s (%s)', async (_label, target) => {
fs.writeFileSync(path.join(tempDir, 'entry.cjs'), 'module.exports = {}')
await expect(
// eslint-disable-next-line @typescript-eslint/no-explicit-any
handler({ ...baseOpts(), entry: 'entry.cjs', target } as any, [])
).rejects.toMatchObject({ code: 'ERR_PNPM_PACK_APP_INVALID_TARGET' })
})
it.each([
['with forward slash', 'sub/dir'],
['with backslash', 'sub\\\\dir'],
['dot dot', '..'],
['relative traversal', '../pwn'],
['absolute', '/tmp/pwn'],
['dot only', '.'],
['null byte', 'pwn\x00'],
['empty', ''],
['Windows reserved CON', 'CON'],
['Windows reserved nul.exe', 'nul.exe'],
['Windows reserved COM1', 'COM1'],
['Windows colon', 'my:tool'],
['Windows pipe', 'my|tool'],
['Windows question mark', 'my?tool'],
['Windows asterisk', 'my*tool'],
['Windows lt', 'my<tool'],
['Windows gt', 'my>tool'],
['Windows quote', 'my"tool'],
['trailing dot', 'tool.'],
['trailing space', 'tool '],
])('rejects invalid --output-name: %s (%j)', async (_label, outputName) => {
fs.writeFileSync(path.join(tempDir, 'entry.cjs'), 'module.exports = {}')
await expect(
handler(
// eslint-disable-next-line @typescript-eslint/no-explicit-any
{ ...baseOpts(), entry: 'entry.cjs', target: 'linux-x64', outputName } as any,
[]
)
).rejects.toMatchObject({ code: 'ERR_PNPM_PACK_APP_INVALID_OUTPUT_NAME' })
})
it('uses --output-name when set, instead of requiring a package.json', async () => {
fs.writeFileSync(path.join(tempDir, 'entry.cjs'), 'module.exports = {}')
// No package.json. We only want to validate that the code path reaches
// target parsing / output-name handling before any network call, so we
// assert on the error that surfaces when the target list is empty.
await expect(
handler(
// eslint-disable-next-line @typescript-eslint/no-explicit-any
{ ...baseOpts(), entry: 'entry.cjs', outputName: 'explicit' } as any,
[]
)
).rejects.toMatchObject({ code: 'ERR_PNPM_PACK_APP_MISSING_TARGET' })
})
})


@@ -60,9 +60,15 @@
{
"path": "../../engine/runtime/commands"
},
{
"path": "../../engine/runtime/node-resolver"
},
{
"path": "../../exec/lifecycle"
},
{
"path": "../../exec/pnpm-cli-runner"
},
{
"path": "../../fetching/directory-fetcher"
},


@@ -4,6 +4,7 @@ import type {
PinnedVersion,
PkgResolutionId,
ProjectRootDir,
SupportedArchitectures,
TrustPolicy,
} from '@pnpm/types'
@@ -73,6 +74,65 @@ export interface VariationsResolution {
export type Resolution = AtomicResolution | VariationsResolution
/** Concrete platform selector used when picking a variant from a VariationsResolution. */
export interface PlatformSelector {
os: string
cpu: string
/** Name of the libc family requested. Omit (or leave `null`) for the default (glibc on Linux, n/a elsewhere). */
libc?: string | null
}
/**
* Resolve a {@link PlatformSelector} from the user's supportedArchitectures config
* and the host's own platform/arch/libc. When `supportedArchitectures.xxx` is set
* and its first entry is not `"current"`, that entry wins; otherwise the host's
* value is used. Additional entries beyond the first are ignored — variant
* selection picks exactly one (os, cpu, libc) triplet per install.
*/
export function resolvePlatformSelector (
supportedArchitectures: SupportedArchitectures | undefined,
host: { platform: string, arch: string, libc: string | null | undefined }
): PlatformSelector {
return {
os: pickFirstNonCurrent(supportedArchitectures?.os) ?? host.platform,
cpu: pickFirstNonCurrent(supportedArchitectures?.cpu) ?? host.arch,
libc: pickFirstNonCurrent(supportedArchitectures?.libc) ?? host.libc,
}
}
/**
* Pick the variant whose target matches the given selector, or `undefined` if
* none does. A variant with no `libc` represents the "default" build — glibc on
* Linux, irrelevant on macOS/Windows. A non-default libc (e.g. `musl`) is a
* separate, non-interchangeable artifact; an exact libc match is required in
* that case so the glibc/default variant doesn't silently win (its `target.libc`
* is nullish).
*/
export function selectPlatformVariant (
variants: PlatformAssetResolution[],
selector: PlatformSelector
): PlatformAssetResolution | undefined {
return variants.find((variant) => variant.targets.some((target) =>
target.os === selector.os &&
target.cpu === selector.cpu &&
libcMatches(target.libc, selector.libc)
))
}
function libcMatches (variantLibc: string | undefined, requestedLibc: string | null | undefined): boolean {
if (requestedLibc == null || requestedLibc === 'glibc') {
return variantLibc == null
}
return variantLibc === requestedLibc
}
function pickFirstNonCurrent (requirements: string[] | undefined): string | undefined {
if (requirements?.length && requirements[0] !== 'current') {
return requirements[0]
}
return undefined
}
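Exercised standalone, the libc rule above shows the fix from the commit message: a musl request never falls back to the default (libc-less) build, while glibc/unset requests select it:

```typescript
// The same libc-matching rule as above, tried against both request kinds.
function libcMatches (variantLibc: string | undefined, requestedLibc: string | null | undefined): boolean {
  if (requestedLibc == null || requestedLibc === 'glibc') {
    return variantLibc == null // only the default build satisfies a default request
  }
  return variantLibc === requestedLibc // a non-default libc needs an exact match
}
console.log(libcMatches(undefined, undefined)) // true: default variant, default request
console.log(libcMatches(undefined, 'musl'))    // false: default must not win a musl request
console.log(libcMatches('musl', 'musl'))       // true
console.log(libcMatches('musl', 'glibc'))      // false
```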
export interface ResolveResult {
id: PkgResolutionId
latest?: string