feat(llm-security): OS sandbox for /security ide-scan <url> (v6.5.0)
VSIX fetch + extract for URL targets now runs in a sub-process wrapped by
sandbox-exec (macOS) or bwrap (Linux), reusing the same primitives proven
by the v5.1 git-clone sandbox. Defense-in-depth — even if our own
zip-extract.mjs ever has a bypass, the kernel refuses any write outside
the per-scan temp directory.
New files:
- scanners/lib/vsix-fetch-worker.mjs — sub-process worker. Argv: --url
--tmpdir; emits one JSON line on stdout (ok/sha256/size/source/extRoot
or ok:false/error/code). Silent on stderr. Exit 0/1.
- scanners/lib/vsix-sandbox.mjs — wrapper. Exports buildSandboxProfile,
buildBwrapArgs, buildSandboxedWorker, runVsixWorker. 35s timeout, 1 MB
stdout cap.
Changes:
- scanners/ide-extension-scanner.mjs: fetchAndExtractVsixUrl is now
sandbox-aware (useSandbox option, default true). In-process logic
preserved as fallback. New meta.source.sandbox field:
'sandbox-exec' | 'bwrap' | 'none' | 'in-process'.
- scan(target, { useSandbox }) defaults to true; tests pass false because
globalThis.fetch mocks do not cross process boundaries.
- Windows fallback: in-process with meta.warnings advisory.
Tests:
- 8 new tests in tests/scanners/vsix-sandbox.test.mjs (per-platform
profile generation, worker arg construction, live worker exit
behavior on invalid URLs — no network).
- Existing URL tests updated to opt out of sandbox (useSandbox: false).
- 1344 → 1352 tests, all green.
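The worker's one-JSON-line contract can be sketched from the consumer side. This is a hypothetical standalone helper (not part of this commit) that mirrors the "parse the last non-empty stdout line as JSON" rule the wrapper uses:

```javascript
// Sketch: consume a worker's single-JSON-line stdout contract.
// Hypothetical helper — the function name is illustrative only; the real
// parsing lives inside runVsixWorker in scanners/lib/vsix-sandbox.mjs.
function parseWorkerLine(stdout) {
  // The worker may be preceded by stray output; only the last non-empty
  // line is the JSON result ({ok:true, ...} or {ok:false, error, code?}).
  const lines = stdout.split('\n').map((l) => l.trim()).filter(Boolean);
  const last = lines[lines.length - 1];
  if (!last) throw new Error('worker produced no output');
  return JSON.parse(last);
}

// Example: a failure line as the worker would emit it.
const payload = parseWorkerLine('{"ok":false,"error":"fetch failed: ECONNREFUSED"}\n');
console.log(payload.ok, payload.error);
```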
Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>
parent fe0193956d
commit 9f893c3858
11 changed files with 434 additions and 24 deletions
@@ -1,5 +1,5 @@
 {
   "name": "llm-security",
   "description": "Security scanning, auditing, and threat modeling for Claude Code projects. Detects secrets, validates MCP servers, assesses security posture, and generates threat models aligned with OWASP LLM Top 10.",
-  "version": "6.4.0"
+  "version": "6.5.0"
 }
@@ -4,6 +4,24 @@ All notable changes to the LLM Security Plugin are documented in this file.
 
 The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/).
 
+## [6.5.0] - 2026-04-17
+
+### Added
+- **OS sandbox for `/security ide-scan <url>`.** VSIX fetch + extract now runs in a sub-process wrapped by `sandbox-exec` (macOS) or `bwrap` (Linux), reusing the same primitives proven by the `git clone` sandbox introduced in v5.1. Defense-in-depth: even if `zip-extract.mjs` has an undiscovered bypass, the kernel refuses any write outside the per-scan temp directory
+- **`scanners/lib/vsix-fetch-worker.mjs`** — Sub-process worker. Argv: `--url <url> --tmpdir <writable-dir>`. Emits a single JSON line on stdout (`{ok, sha256, size, finalUrl, source, extRoot}` or `{ok:false, error, code?}`). Exit 0 on success, 1 on failure. Silent on stderr
+- **`scanners/lib/vsix-sandbox.mjs`** — Wrapper. Exports `buildSandboxProfile`, `buildBwrapArgs`, `buildSandboxedWorker(tmpDir, args)`, `runVsixWorker(url, tmpDir, opts)`. 35 s timeout, 1 MB stdout cap, deterministic JSON-line protocol
+- **`scan(url, { useSandbox })` option.** Default `true` for CLI invocations; tests pass `false` to keep `globalThis.fetch` mocking working (mocks do not cross process boundaries). When sandbox unavailable on the platform (e.g., Windows), a warning is added to `meta.warnings` and the scan still completes via the in-process fallback
+- **`meta.source.sandbox`** — New envelope field: `'sandbox-exec' | 'bwrap' | 'none' | 'in-process'`. Tells the report which protection layer was actually active
+- **8 new tests** in `tests/scanners/vsix-sandbox.test.mjs` covering profile generation per platform, worker arg construction, and live worker exit behavior on invalid URLs (no network required)
+
+### Changed
+- `fetchAndExtractVsixUrl` in `ide-extension-scanner.mjs` is now sandbox-aware (`useSandbox` option, default `true`). Existing in-process logic preserved as fallback path
+- Version bump: 6.4.0 → 6.5.0 across all files
+
+### Why
+- Aligns the IDE-scan URL pipeline with the same defense-in-depth posture as the GitHub clone pipeline — kernel-enforced FS confinement instead of in-process validation alone
+- VSIX is untrusted bytes from a third-party registry; even with hardened parsing, an OS sandbox is the right blast-radius constraint for filesystem writes
+
 ## [6.4.0] - 2026-04-17
 
 ### Added
@@ -1,6 +1,6 @@
-# LLM Security Plugin (v6.4.0)
+# LLM Security Plugin (v6.5.0)
 
-Security scanning, auditing, and threat modeling for Claude Code projects. 5 frameworks: OWASP LLM Top 10, Agentic AI Top 10 (ASI), Skills Top 10 (AST), MCP Top 10, AI Agent Traps (DeepMind). 1344 tests.
+Security scanning, auditing, and threat modeling for Claude Code projects. 5 frameworks: OWASP LLM Top 10, Agentic AI Top 10 (ASI), Skills Top 10 (AST), MCP Top 10, AI Agent Traps (DeepMind). 1352 tests.
 
 ## Commands
@@ -98,6 +98,8 @@ Scanner prefix: MCI. OWASP: MCP03, MCP06, MCP09. Invoked by `mcp-inspect` and `m
 
 **v6.4.0 — URL support.** Targets can be Marketplace, OpenVSX, or direct `.vsix` URLs. Pipeline: `lib/vsix-fetch.mjs` (HTTPS-only fetch with 50MB cap, 30s timeout, SHA-256, manual redirect host whitelist) → `lib/zip-extract.mjs` (zero-dep ZIP parser, rejects zip-slip/symlink/absolute/drive-letter/encrypted/ZIP64, caps: 10 000 entries, 500MB uncomp, 100x ratio, depth 20) → existing scan pipeline against extracted `extension/` subdir → temp dir always cleaned in `try/finally`. Envelope.meta.source = `{ type: "url", kind, url, finalUrl, sha256, size, publisher?, name?, version? }`.
 
+**v6.5.0 — OS sandbox.** Fetch + extract for URL targets now spawns `lib/vsix-fetch-worker.mjs` in a sub-process wrapped by `sandbox-exec` (macOS) or `bwrap` (Linux) — same primitives reused from `git-clone.mjs`. Helper: `lib/vsix-sandbox.mjs` exports `buildSandboxProfile`, `buildBwrapArgs`, `buildSandboxedWorker`, `runVsixWorker`. Worker IPC: argv `--url <url> --tmpdir <dir>` → single JSON line on stdout (`{ok, sha256, size, finalUrl, source, extRoot}` or `{ok:false, error, code?}`). Defense-in-depth — if the in-process ZIP parser ever has a bypass, the kernel still refuses writes outside `<tmpdir>`. `scan(target, { useSandbox })` defaults to `true`; tests pass `false` since `globalThis.fetch` mocks do not cross process boundaries. Windows fallback: in-process with `meta.warnings` advisory. Envelope `meta.source.sandbox`: `'sandbox-exec' | 'bwrap' | 'none' | 'in-process'`.
+
 Run: `node scanners/ide-extension-scanner.mjs [target|url] [--vscode-only] [--intellij-only] [--include-builtin] [--online] [--format json|compact] [--fail-on <sev>] [--output-file <path>]`. Invoked by `/security ide-scan`.
 
 ## Token Budget (ENFORCED)
@@ -4,7 +4,7 @@
 
 *Built for my own Claude Code workflow and shared openly for anyone who finds it useful. This is a solo project — bug reports and feature requests are welcome, but pull requests are not accepted.*
 
-![Version](https://img.shields.io/badge/version-6.4.0-blue)
+![Version](https://img.shields.io/badge/version-6.5.0-blue)
 ![License](https://img.shields.io/badge/license-MIT-green)
 ![Maintenance](https://img.shields.io/badge/maintenance-solo--project-orange)
 ![PRs](https://img.shields.io/badge/PRs-not--accepted-red)
@@ -822,6 +822,7 @@ This plugin provides full-stack security hardening (static analysis + supply cha
 
 | Version | Date | Highlights |
 |---------|------|------------|
+| **6.5.0** | 2026-04-17 | **OS sandbox for `/security ide-scan <url>`.** VSIX fetch + extract now runs in a sub-process wrapped by `sandbox-exec` (macOS) or `bwrap` (Linux), reusing the same primitives proven by the v5.1 git-clone sandbox. Defense-in-depth — even if `lib/zip-extract.mjs` ever has a bypass, the kernel refuses any write outside the per-scan temp directory. New: `lib/vsix-fetch-worker.mjs` (sub-process worker with deterministic JSON-line IPC) and `lib/vsix-sandbox.mjs` (`buildSandboxProfile` / `buildBwrapArgs` / `buildSandboxedWorker` / `runVsixWorker`, 35 s timeout, 1 MB stdout cap). New `scan(target, { useSandbox })` option (default `true` for CLI; tests use `false` since `globalThis.fetch` mocks do not cross processes). Windows fallback: in-process with `meta.warnings` advisory. Envelope `meta.source.sandbox` field: `'sandbox-exec' \| 'bwrap' \| 'none' \| 'in-process'`. 1352 tests (was 1344). |
 | **6.4.0** | 2026-04-17 | **`/security ide-scan <url>` — pre-install verification.** The IDE extension scanner now accepts URLs and fetches the VSIX before scanning. Supported: VS Code Marketplace (`https://marketplace.visualstudio.com/items?itemName=publisher.name`), OpenVSX (`https://open-vsx.org/extension/publisher/name[/version]`), and direct `.vsix` URLs. New libraries: `lib/vsix-fetch.mjs` (HTTPS-only fetch with 50MB cap, 30s timeout, SHA-256, manual host-whitelisted redirects) and `lib/zip-extract.mjs` (zero-dep ZIP parser, rejects zip-slip / symlinks / absolute paths / drive letters / encrypted entries / ZIP64; caps: 10 000 entries, 500MB uncompressed, 100x expansion ratio, depth 20). Temp dir always cleaned in `try/finally`. Envelope `meta.source` carries `{ type: "url", kind, url, finalUrl, sha256, size, publisher, name, version }`. New knowledge file: `marketplace-api-notes.md`. GitHub repo URLs intentionally not supported (would require a build step). 1344 tests (was 1296). |
 | **6.3.0** | 2026-04-17 | **IDE extension prescan.** New `/security ide-scan` command and `ide-extension-scanner.mjs` (prefix IDE) discover and audit installed VS Code extensions (and forks: Cursor, Windsurf, VSCodium, code-server, Insiders, Remote-SSH; JetBrains is a v1.1 stub). 7 IDE-specific checks: blocklist match, theme-with-code, sideload (`.vsix`), broad activation (`*`, `onStartupFinished`), Levenshtein typosquat ≤2 vs top-100, extension-pack expansion, dangerous `vscode:uninstall` hooks. Per-extension orchestration of UNI/ENT/NET/TNT/MEM/SCR scanners with bounded concurrency. OS-aware discovery via `lib/ide-extension-discovery.mjs` (platform-specific suffix parsing for `darwin-x64`, `linux-arm64`, etc.). Offline-first; `--online` opt-in for future Marketplace/OSV.dev lookups. New knowledge files: `ide-extension-threat-patterns.md` (10 categories, 2024-2026 case studies from Koi Security — GlassWorm, WhiteCobra, TigerJack, Material Theme), `top-vscode-extensions.json` (typosquat seed + blocklist), `top-jetbrains-plugins.json` (stub). 1296 tests (was 1274). |
 | **6.2.0** | 2026-04-17 | **Opus 4.7 + Claude Code 2.1.112 alignment.** Bash-normalize extended with T5 (`${IFS}` word-splitting) and T6 (ANSI-C `$'\xHH'` hex quoting) layers. New `pre-compact-scan.mjs` PreCompact hook — scans transcript tail (500 KB cap, <500 ms) for injection + credentials before context compaction. Modes: `block` / `warn` / `off` via `LLM_SECURITY_PRECOMPACT_MODE`. Agent files reframed for Opus 4.7's more literal instruction-following (Step 0 generalization boundary + parallel Read-hint in skill-scanner + mcp-scanner). New `docs/security-hardening-guide.md` with env-var reference, sandboxing notes, system-card §5.2.1 / §6.3.1.1 mapping. CLAUDE.md Defense Philosophy links to system card. 1274 tests (was 1264). |
@@ -1,6 +1,6 @@
 {
   "name": "llm-security",
-  "version": "6.4.0",
+  "version": "6.5.0",
   "description": "Security scanning, auditing, and threat modeling for Claude Code projects",
   "type": "module",
   "bin": {
@@ -29,6 +29,7 @@ import { parseVSCodeExtension, parseVsixFile } from './lib/ide-extension-parser.
 import { loadTopVSCode, loadVSCodeBlocklist, normalizeId } from './lib/ide-extension-data.mjs';
 import { fetchVsixFromUrl, detectUrlType } from './lib/vsix-fetch.mjs';
 import { extractToDir, ZipError } from './lib/zip-extract.mjs';
+import { runVsixWorker } from './lib/vsix-sandbox.mjs';
 
 import { scan as scanUnicode } from './unicode-scanner.mjs';
 import { scan as scanEntropy } from './entropy-scanner.mjs';
@@ -37,7 +38,7 @@ import { scan as scanTaint } from './taint-tracer.mjs';
 import { scan as scanMemoryPoisoning } from './memory-poisoning-scanner.mjs';
 import { scan as scanSupplyChain } from './supply-chain-recheck.mjs';
 
-const VERSION = '6.4.0';
+const VERSION = '6.5.0';
 const SCANNER = 'IDE';
 
 // ---------------------------------------------------------------------------
@@ -53,14 +54,44 @@ function isUrlTarget(target) {
  * `parseVSCodeExtension` should be pointed at. VSIX layout always nests the
  * extension under `extension/`.
  *
+ * Two modes:
+ * - useSandbox=true (default for CLI): spawns vsix-fetch-worker.mjs under
+ *   sandbox-exec (macOS) / bwrap (Linux) so any FS write is restricted to
+ *   <tempDir>. Defense-in-depth against zip-extract bugs.
+ * - useSandbox=false: runs fetch + extract in-process. Used by tests that
+ *   mock globalThis.fetch (mocking does not cross process boundaries).
+ *
  * Caller MUST `await rm(result.tempDir, { recursive: true, force: true })` in finally.
  *
  * @param {string} url
- * @returns {Promise<{ extRoot: string, tempDir: string, source: object }>}
+ * @param {{ useSandbox?: boolean }} [opts]
+ * @returns {Promise<{ extRoot: string, tempDir: string, source: object, sandbox: 'sandbox-exec'|'bwrap'|null|'in-process' }>}
  */
-async function fetchAndExtractVsixUrl(url) {
+async function fetchAndExtractVsixUrl(url, opts = {}) {
+  const useSandbox = opts.useSandbox !== false;
   const tempDir = await mkdtemp(join(tmpdir(), 'llm-sec-vsix-'));
   try {
+    if (useSandbox) {
+      const { ok, sandbox, payload } = await runVsixWorker(url, tempDir);
+      if (!ok) {
+        const msg = payload && payload.error ? payload.error : 'worker failed';
+        throw new Error(msg);
+      }
+      const { type: kind, ...sourceMeta } = payload.source;
+      const source = {
+        type: 'url',
+        kind,
+        url,
+        finalUrl: payload.finalUrl,
+        sha256: payload.sha256,
+        size: payload.size,
+        sandbox: sandbox || 'none',
+        ...sourceMeta,
+      };
+      return { extRoot: payload.extRoot, tempDir, source, sandbox: sandbox || null };
+    }
+
+    // In-process path (tests, or fallback when caller wants no sub-process).
     let fetched;
     try {
       fetched = await fetchVsixFromUrl(url);
@@ -75,23 +106,21 @@ async function fetchAndExtractVsixUrl(url) {
       }
       throw err;
     }
-    // VSIX nests files under `extension/`. If that doesn't exist, fall back to
-    // the temp dir itself (some packagers omit the wrapper).
     const nested = join(tempDir, 'extension');
     const extRoot = existsSync(nested) ? nested : tempDir;
     const { type: kind, ...sourceMeta } = fetched.source;
     const source = {
       type: 'url',
-      kind, // 'marketplace' | 'openvsx' | 'vsix'
+      kind,
       url,
       finalUrl: fetched.finalUrl,
       sha256: fetched.sha256,
       size: fetched.size,
+      sandbox: 'in-process',
       ...sourceMeta,
     };
-    return { extRoot, tempDir, source };
+    return { extRoot, tempDir, source, sandbox: 'in-process' };
   } catch (err) {
-    // Cleanup on error before propagating.
     await rm(tempDir, { recursive: true, force: true }).catch(() => {});
     throw err;
   }
@@ -459,10 +488,13 @@ export async function scan(target, options = {}) {
       warnings.push('GitHub repo URLs are not supported in v6.4.0 — would require build step. Use the Marketplace, OpenVSX, or a direct .vsix link.');
     } else {
       try {
-        const fetched = await fetchAndExtractVsixUrl(target);
+        const fetched = await fetchAndExtractVsixUrl(target, { useSandbox: options.useSandbox });
         urlSource = fetched.source;
         urlTempDir = fetched.tempDir;
         target = fetched.extRoot; // forward into single-target path mode
+        if (fetched.sandbox === null && options.useSandbox !== false) {
+          warnings.push('OS sandbox unavailable on this platform — VSIX extracted without sandbox-exec/bwrap. Defense-in-depth reduced to in-process zip-extract validation.');
+        }
       } catch (err) {
         warnings.push(`URL fetch/extract failed: ${err.message}`);
       }
76 plugins/llm-security/scanners/lib/vsix-fetch-worker.mjs Normal file
@@ -0,0 +1,76 @@
#!/usr/bin/env node
// vsix-fetch-worker.mjs — Sub-process worker that fetches a VSIX URL and extracts
// it to a writable directory. Designed to be spawned under sandbox-exec (macOS),
// bwrap (Linux), or directly (Windows fallback).
//
// Contract:
//   stdin:  none
//   argv:   --url <url> --tmpdir <writable-dir>
//   stdout: single JSON line {ok:true, sha256, size, finalUrl, source, extRoot}
//           on success, or {ok:false, error:"<msg>", code?:"<ZIP_CODE>"} on failure
//   stderr: never (silent — all errors via JSON on stdout)
//   exit:   0 on success, 1 on any failure (caller still parses stdout)
//
// Why a worker: the parent process can wrap this command in sandbox-exec / bwrap
// so any filesystem write the ZIP extractor performs is restricted to <tmpdir>.
// Defense-in-depth — even if our own zip-slip / symlink validation has a bug,
// the OS sandbox cannot let bytes land outside <tmpdir>.

import { writeFileSync, existsSync } from 'node:fs';
import { join } from 'node:path';
import { fetchVsixFromUrl } from './vsix-fetch.mjs';
import { extractToDir, ZipError } from './zip-extract.mjs';

function emit(obj) {
  process.stdout.write(JSON.stringify(obj) + '\n');
}

function parseArgs(argv) {
  const out = { url: null, tmpdir: null };
  for (let i = 0; i < argv.length; i++) {
    if (argv[i] === '--url' && i + 1 < argv.length) out.url = argv[++i];
    else if (argv[i] === '--tmpdir' && i + 1 < argv.length) out.tmpdir = argv[++i];
  }
  return out;
}

async function main() {
  const { url, tmpdir: dir } = parseArgs(process.argv.slice(2));
  if (!url || !dir) {
    emit({ ok: false, error: 'missing --url or --tmpdir' });
    process.exit(1);
  }
  let fetched;
  try {
    fetched = await fetchVsixFromUrl(url);
  } catch (err) {
    emit({ ok: false, error: `fetch failed: ${err.message}` });
    process.exit(1);
  }
  try {
    await extractToDir(fetched.buffer, dir);
  } catch (err) {
    if (err instanceof ZipError) {
      emit({ ok: false, error: `malformed VSIX (${err.code}): ${err.message}`, code: err.code });
    } else {
      emit({ ok: false, error: `extract failed: ${err.message}` });
    }
    process.exit(1);
  }
  const nested = join(dir, 'extension');
  const extRoot = existsSync(nested) ? nested : dir;
  emit({
    ok: true,
    sha256: fetched.sha256,
    size: fetched.size,
    finalUrl: fetched.finalUrl,
    source: fetched.source,
    extRoot,
  });
  process.exit(0);
}

main().catch((err) => {
  emit({ ok: false, error: `worker crashed: ${err.message || String(err)}` });
  process.exit(1);
});
169 plugins/llm-security/scanners/lib/vsix-sandbox.mjs Normal file
@@ -0,0 +1,169 @@
// vsix-sandbox.mjs — Spawn vsix-fetch-worker.mjs under an OS sandbox so any
// filesystem writes the ZIP extractor performs are restricted to a single
// temp directory.
//
// Reuses the same sandbox primitives proven by `git-clone.mjs`:
//   - macOS: sandbox-exec with a deny-file-write profile + subpath allow
//   - Linux: bwrap with --ro-bind / --bind / --unshare-all
//   - Windows / fallback: spawn directly + WARN to stderr (no OS sandbox)
//
// Defense-in-depth: even if our own zip-extract.mjs has a zip-slip / symlink
// bypass we did not foresee, the OS will refuse the write. The in-process
// validation in zip-extract.mjs remains the first line of defense.

import { spawn } from 'node:child_process';
import { spawnSync } from 'node:child_process';
import { realpathSync } from 'node:fs';
import { fileURLToPath } from 'node:url';
import { dirname, resolve as resolvePath } from 'node:path';

const __dirname = dirname(fileURLToPath(import.meta.url));
const WORKER_PATH = resolvePath(__dirname, 'vsix-fetch-worker.mjs');
const WORKER_TIMEOUT_MS = 35_000; // fetch is 30s, give worker 5s of slack
const MAX_OUTPUT_BYTES = 1024 * 1024; // 1MB JSON cap (output is tiny in practice)

/**
 * Build the sandbox-exec profile that allows writes only to `allowedWritePath`.
 * Mirrors `buildSandboxProfile` in git-clone.mjs.
 * @returns {string|null} null if not on macOS or sandbox-exec missing
 */
export function buildSandboxProfile(allowedWritePath) {
  if (process.platform !== 'darwin') return null;
  const check = spawnSync('which', ['sandbox-exec'], { encoding: 'utf8' });
  if (check.status !== 0) return null;
  const realPath = realpathSync(allowedWritePath);
  return [
    '(version 1)',
    '(allow default)',
    '(deny file-write*)',
    `(allow file-write* (subpath "${realPath}"))`,
    '(allow file-write* (literal "/dev/null"))',
    '(allow file-write* (literal "/dev/tty"))',
  ].join('');
}

/**
 * Build bwrap arguments that allow writes only to `allowedWritePath`.
 * Mirrors `buildBwrapArgs` in git-clone.mjs.
 * @returns {string[]|null} null if not on Linux or bwrap unusable
 */
export function buildBwrapArgs(allowedWritePath, innerArgs) {
  if (process.platform !== 'linux') return null;
  const check = spawnSync('which', ['bwrap'], { encoding: 'utf8' });
  if (check.status !== 0) return null;
  // Probe — bwrap is shipped on Ubuntu 24.04+ but may need admin AppArmor config.
  const probe = spawnSync(
    'bwrap',
    ['--ro-bind', '/', '/', '--dev', '/dev', '/bin/true'],
    { stdio: 'ignore', timeout: 5000 },
  );
  if (probe.status !== 0) return null;
  return [
    '--ro-bind', '/', '/',
    '--bind', allowedWritePath, allowedWritePath,
    '--dev', '/dev',
    '--unshare-all',
    '--new-session',
    '--die-with-parent',
    ...innerArgs,
  ];
}

/**
 * Resolve the spawn command for a worker invocation. Returns the OS sandbox
 * if available; otherwise plain `node` with a `sandbox: null` flag so the
 * caller can WARN. Identical pattern to `buildSandboxedClone`.
 *
 * @param {string} tmpDir writable temp dir for the worker
 * @param {string[]} workerArgs argv for the worker (after `node <worker>`)
 * @returns {{cmd:string, args:string[], sandbox: 'sandbox-exec'|'bwrap'|null}}
 */
export function buildSandboxedWorker(tmpDir, workerArgs) {
  const innerArgs = ['node', WORKER_PATH, ...workerArgs];

  const profile = buildSandboxProfile(tmpDir);
  if (profile) {
    return { cmd: 'sandbox-exec', args: ['-p', profile, ...innerArgs], sandbox: 'sandbox-exec' };
  }

  const bwrapArgs = buildBwrapArgs(tmpDir, innerArgs);
  if (bwrapArgs) {
    return { cmd: 'bwrap', args: bwrapArgs, sandbox: 'bwrap' };
  }

  // Fallback — Windows or sandbox tools missing. Worker still runs, just not boxed.
  return { cmd: innerArgs[0], args: innerArgs.slice(1), sandbox: null };
}

/**
 * Spawn the VSIX worker and parse its single JSON line of output.
 *
 * @param {string} url URL to fetch (passed via argv to worker)
 * @param {string} tmpDir writable directory for extracted files
 * @param {object} [opts]
 * @param {boolean} [opts.allowFallback=true] if false, throw when no OS sandbox
 * @returns {Promise<{ok:boolean, sandbox:'sandbox-exec'|'bwrap'|null, payload:object}>}
 */
export function runVsixWorker(url, tmpDir, opts = {}) {
  const { allowFallback = true } = opts;
  const { cmd, args, sandbox } = buildSandboxedWorker(tmpDir, ['--url', url, '--tmpdir', tmpDir]);

  if (!sandbox && !allowFallback) {
    return Promise.reject(new Error('no OS sandbox available and fallback disabled'));
  }

  return new Promise((resolve, reject) => {
    const child = spawn(cmd, args, {
      stdio: ['ignore', 'pipe', 'pipe'],
      timeout: WORKER_TIMEOUT_MS,
      env: { ...process.env },
    });

    let stdout = '';
    let stdoutBytes = 0;
    let truncated = false;
    child.stdout.on('data', (chunk) => {
      stdoutBytes += chunk.length;
      if (stdoutBytes > MAX_OUTPUT_BYTES) {
        truncated = true;
        try { child.kill('SIGKILL'); } catch {}
        return;
      }
      stdout += chunk.toString('utf8');
    });

    let stderr = '';
    child.stderr.on('data', (chunk) => { stderr += chunk.toString('utf8'); });

    child.on('error', (err) => reject(new Error(`worker spawn failed: ${err.message}`)));

    child.on('close', (code, signal) => {
      if (truncated) {
        return reject(new Error('worker output exceeded 1MB cap'));
      }
      if (signal === 'SIGTERM' && code === null) {
        return reject(new Error(`worker timed out after ${WORKER_TIMEOUT_MS}ms`));
      }
      // Parse the last non-empty line as JSON (worker writes one line on success/failure).
      const lines = stdout.split('\n').map((l) => l.trim()).filter(Boolean);
      const last = lines[lines.length - 1];
      if (!last) {
        const tail = stderr.trim().slice(0, 200);
        return reject(new Error(`worker produced no output${tail ? ` (stderr: ${tail})` : ''}`));
      }
      let payload;
      try {
        payload = JSON.parse(last);
      } catch {
        return reject(new Error(`worker emitted non-JSON: ${last.slice(0, 120)}`));
      }
      resolve({ ok: payload.ok === true, sandbox, payload });
    });
  });
}

export const __testing = {
  WORKER_PATH,
  WORKER_TIMEOUT_MS,
  MAX_OUTPUT_BYTES,
};
@@ -54,7 +54,7 @@ describe('ide-extension-scanner — URL mode', () => {
 
   it('rejects unsupported URL with a warning, no extensions scanned', async () => {
     installFetchRouter(() => null);
-    const env = await scan('https://example.com/random.zip', { vscodeOnly: true });
+    const env = await scan('https://example.com/random.zip', { vscodeOnly: true, useSandbox: false });
     assert.equal(env.extensions.length, 0);
     assert.ok(env.meta.warnings.some(w => /unsupported URL/i.test(w)));
     assert.equal(env.meta.source, null);
@@ -62,7 +62,7 @@ describe('ide-extension-scanner — URL mode', () => {
 
   it('reports github URL as unsupported in v6.4.0', async () => {
     installFetchRouter(() => null);
-    const env = await scan('https://github.com/anthropic/claude-code', { vscodeOnly: true });
+    const env = await scan('https://github.com/anthropic/claude-code', { vscodeOnly: true, useSandbox: false });
     assert.equal(env.extensions.length, 0);
     assert.ok(env.meta.warnings.some(w => /GitHub repo URLs/i.test(w)));
   });
@@ -83,7 +83,7 @@ describe('ide-extension-scanner — URL mode', () => {
       return null;
     });
 
-    const env = await scan('https://open-vsx.org/extension/anthropic/claude-code', { vscodeOnly: true });
+    const env = await scan('https://open-vsx.org/extension/anthropic/claude-code', { vscodeOnly: true, useSandbox: false });
     assert.ok(metaCalled, 'expected metadata fetch for latest version');
     assert.ok(downloadCalled, 'expected VSIX download');
     assert.equal(env.extensions.length, 1);
@@ -109,7 +109,7 @@ describe('ide-extension-scanner — URL mode', () => {
       return null;
     });
 
-    const env = await scan('https://marketplace.visualstudio.com/items?itemName=anthropic.claude-code', { vscodeOnly: true });
+    const env = await scan('https://marketplace.visualstudio.com/items?itemName=anthropic.claude-code', { vscodeOnly: true, useSandbox: false });
     assert.equal(downloads, 1);
     assert.equal(env.extensions.length, 1);
     assert.equal(env.extensions[0].id, 'anthropic.claude-code');
@@ -120,7 +120,7 @@ describe('ide-extension-scanner — URL mode', () => {
   it('cleans up temp dir even when extraction fails', async () => {
     // Return a non-zip body so extract throws.
     installFetchRouter(() => mockResponse(Buffer.from('not a zip at all')));
-    const env = await scan('https://example.com/bad.vsix', { vscodeOnly: true });
+    const env = await scan('https://example.com/bad.vsix', { vscodeOnly: true, useSandbox: false });
     assert.equal(env.extensions.length, 0);
     assert.ok(env.meta.warnings.some(w => /malformed VSIX/.test(w)));
   });
@@ -131,14 +131,14 @@ describe('ide-extension-scanner — URL mode', () => {
       { name: '../escape.txt', data: 'pwned' },
     ]);
     installFetchRouter(() => mockResponse(evil));
-    const env = await scan('https://example.com/evil.vsix', { vscodeOnly: true });
+    const env = await scan('https://example.com/evil.vsix', { vscodeOnly: true, useSandbox: false });
     assert.equal(env.extensions.length, 0);
     assert.ok(env.meta.warnings.some(w => /malformed VSIX/.test(w) && /traversal/.test(w)));
   });
 
   it('handles fetch network failure cleanly', async () => {
     installFetchRouter(() => { throw new Error('ECONNREFUSED'); });
-    const env = await scan('https://open-vsx.org/extension/foo/bar', { vscodeOnly: true });
+    const env = await scan('https://open-vsx.org/extension/foo/bar', { vscodeOnly: true, useSandbox: false });
     assert.equal(env.extensions.length, 0);
     assert.ok(env.meta.warnings.some(w => /URL fetch\/extract failed/.test(w)));
   });
112 plugins/llm-security/tests/scanners/vsix-sandbox.test.mjs Normal file
@@ -0,0 +1,112 @@
// vsix-sandbox.test.mjs — Tests for the VSIX sandbox wrapper and worker.

import { describe, it } from 'node:test';
import assert from 'node:assert/strict';
import { mkdtemp, rm } from 'node:fs/promises';
import { tmpdir } from 'node:os';
import { join } from 'node:path';
import { spawnSync } from 'node:child_process';
import {
  buildSandboxProfile,
  buildBwrapArgs,
  buildSandboxedWorker,
  runVsixWorker,
  __testing,
} from '../../scanners/lib/vsix-sandbox.mjs';

describe('vsix-sandbox — buildSandboxProfile', () => {
  it('returns null on non-darwin', () => {
    if (process.platform === 'darwin') return; // Not applicable here.
    const profile = buildSandboxProfile('/tmp');
    assert.equal(profile, null);
  });

  it('returns a valid profile string on macOS when sandbox-exec exists', () => {
    if (process.platform !== 'darwin') return;
    const has = spawnSync('which', ['sandbox-exec'], { encoding: 'utf8' });
    if (has.status !== 0) return;
    const profile = buildSandboxProfile('/tmp');
    assert.ok(profile, 'expected profile string on macOS');
    assert.match(profile, /\(version 1\)/);
    assert.match(profile, /\(deny file-write\*\)/);
    assert.match(profile, /\(allow file-write\* \(subpath /);
  });
});

describe('vsix-sandbox — buildBwrapArgs', () => {
  it('returns null on non-linux', () => {
    if (process.platform === 'linux') return;
    const args = buildBwrapArgs('/tmp', ['/bin/true']);
    assert.equal(args, null);
  });
});

describe('vsix-sandbox — buildSandboxedWorker', () => {
  it('returns sandbox-exec on macOS, bwrap on Linux, or null fallback', () => {
    const { cmd, args, sandbox } = buildSandboxedWorker('/tmp', ['--url', 'https://x', '--tmpdir', '/tmp']);
    assert.ok(cmd);
    assert.ok(Array.isArray(args));
    if (process.platform === 'darwin') {
      const has = spawnSync('which', ['sandbox-exec'], { encoding: 'utf8' });
      if (has.status === 0) {
        assert.equal(sandbox, 'sandbox-exec');
        assert.equal(cmd, 'sandbox-exec');
        assert.equal(args[0], '-p');
      }
    } else if (process.platform === 'linux') {
      // Could be 'bwrap' or null depending on availability — both are valid.
      assert.ok(sandbox === 'bwrap' || sandbox === null);
    } else {
      assert.equal(sandbox, null);
      assert.equal(cmd, 'node');
    }
  });

  it('always includes the worker path and forwarded args', () => {
    const { args } = buildSandboxedWorker('/tmp', ['--url', 'https://example/', '--tmpdir', '/tmp']);
    const joined = args.join(' ');
    assert.match(joined, /vsix-fetch-worker\.mjs/);
    assert.match(joined, /--url https:\/\/example\//);
    assert.match(joined, /--tmpdir \/tmp/);
  });
});

describe('vsix-sandbox — runVsixWorker (live worker, no network)', () => {
  it('handles non-HTTPS URL: worker exits with ok:false and a fetch error', async () => {
    const dir = await mkdtemp(join(tmpdir(), 'llm-sec-vsix-test-'));
    try {
      const { ok, payload, sandbox } = await runVsixWorker('http://example.com/foo.vsix', dir);
      assert.equal(ok, false);
      assert.ok(payload.error, 'expected error message');
      assert.match(payload.error, /fetch failed|HTTPS|unsupported/i);
      // Sandbox may be 'sandbox-exec', 'bwrap', or null on Windows. All valid.
      assert.ok(sandbox === 'sandbox-exec' || sandbox === 'bwrap' || sandbox === null);
    } finally {
      await rm(dir, { recursive: true, force: true });
    }
  });

  it('handles unsupported URL kind: worker exits with ok:false', async () => {
    const dir = await mkdtemp(join(tmpdir(), 'llm-sec-vsix-test-'));
    try {
      const { ok, payload } = await runVsixWorker('https://example.com/random.zip', dir);
      assert.equal(ok, false);
      assert.match(payload.error, /unsupported URL|fetch failed/i);
    } finally {
      await rm(dir, { recursive: true, force: true });
    }
  });

  it('rejects when no --url or --tmpdir is provided (worker arg validation)', async () => {
    // Construct a minimal direct worker call without any args.
    const { spawn } = await import('node:child_process');
    const child = spawn('node', [__testing.WORKER_PATH], { stdio: ['ignore', 'pipe', 'pipe'] });
    let out = '';
    child.stdout.on('data', (c) => { out += c.toString('utf8'); });
    const code = await new Promise((resolve) => child.on('close', resolve));
    assert.equal(code, 1);
    const parsed = JSON.parse(out.trim());
    assert.equal(parsed.ok, false);
    assert.match(parsed.error, /missing --url or --tmpdir/);
  });
});