feat(llm-security): /security ide-scan <url> — Marketplace/OpenVSX/direct VSIX (v6.4.0)
Pre-installation verification of VS Code extensions via URL — fetch a remote VSIX, extract it in a hardened sandbox, and run the existing IDE scanner pipeline against it. No npm dependencies.

Sources:
- VS Code Marketplace (publisher.gallery.vsassets.io direct download)
- OpenVSX (open-vsx.org official API)
- Direct .vsix HTTPS URLs

Defenses:
- HTTPS-only, TLS verified, manual redirect with per-source host whitelist
- 30s total timeout via AbortController
- 50MB compressed cap, 500MB uncompressed, 100x expansion ratio
- Zero-dep ZIP extractor: zip-slip, absolute paths, drive letters, NUL bytes, symlinks (Unix mode 0xA000), depth limits, ZIP64 rejected, encrypted rejected
- SHA-256 streamed during fetch, surfaced in meta.source
- Temp dir cleanup in all paths (try/finally)

Files:
- scanners/lib/vsix-fetch.mjs (HTTPS fetcher, host whitelist, streaming SHA-256)
- scanners/lib/zip-extract.mjs (zero-dep parser with hardening caps)
- knowledge/marketplace-api-notes.md (endpoint reference)
- 3 test files (48 tests added: vsix-fetch, zip-extract, ide-extension-url)

Tests: 1296 → 1344 (all green).

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>
This commit is contained in: parent 6252e55700 · commit fe0193956d
16 changed files with 1543 additions and 22 deletions
@@ -22,21 +22,21 @@ Then open Claude Code and type `/plugin` to browse and install plugins from the

 ## Plugins

-### [LLM Security](plugins/llm-security/) `v6.3.0`
+### [LLM Security](plugins/llm-security/) `v6.4.0`

 Security scanning, auditing, and threat modeling for agentic AI projects.

 Built on OWASP LLM Top 10 (2025), OWASP Agentic AI Top 10, and the AI Agent Traps taxonomy (Google DeepMind, 2025). Three layers of protection:

 - **Automated enforcement** — 9 hooks that block dangerous operations in real time (prompt injection, secrets in code, destructive commands, supply chain guardrails, transcript scanning before context compaction)
-- **Deterministic scanning** — 22 Node.js scanners (10 orchestrated + 12 standalone) for byte-level analysis: Shannon entropy, Unicode codepoints, typosquatting detection, taint flow, DNS resolution, git forensics, AI-BOM, attack simulation, IDE extension prescan. Bash-normalize T1-T6 for obfuscation-resistant denylists
+- **Deterministic scanning** — 22 Node.js scanners (10 orchestrated + 12 standalone) for byte-level analysis: Shannon entropy, Unicode codepoints, typosquatting detection, taint flow, DNS resolution, git forensics, AI-BOM, attack simulation, IDE extension prescan (now with URL fetch from Marketplace / OpenVSX / direct VSIX, hardened ZIP extractor for zip-slip / symlinks / bombs). Bash-normalize T1-T6 for obfuscation-resistant denylists
 - **Advisory analysis** — 19 commands that scan, audit, and model threats with structured reports, letter grades, and actionable remediation
 - **Enterprise governance** — Compliance mapping (EU AI Act, NIST AI RMF, ISO 42001), SARIF 2.1.0 output, structured audit trail, policy-as-code, standalone CLI
 - **Opus 4.7 aligned** — Agent instructions rewritten for literal instruction-following (system card §6.3.1.1), defense-in-depth posture per §5.2.1, production hardening guide

 Key commands: `/security posture`, `/security audit`, `/security scan`, `/security ide-scan`, `/security threat-model`, `/security plugin-audit`

-6 specialized agents · 22 scanners · 9 hooks · 18 knowledge docs · 1296 tests
+6 specialized agents · 22 scanners · 9 hooks · 19 knowledge docs · 1344 tests

 → [Full documentation](plugins/llm-security/README.md)
@@ -1,5 +1,5 @@
 {
   "name": "llm-security",
   "description": "Security scanning, auditing, and threat modeling for Claude Code projects. Detects secrets, validates MCP servers, assesses security posture, and generates threat models aligned with OWASP LLM Top 10.",
-  "version": "6.3.0"
+  "version": "6.4.0"
 }
@@ -4,6 +4,29 @@ All notable changes to the LLM Security Plugin are documented in this file.

 The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/).

+## [6.4.0] - 2026-04-17
+
+### Added
+- **`/security ide-scan <url>` — pre-install verification.** The IDE extension scanner now accepts URLs as targets and fetches the VSIX before scanning. Supported sources:
+  - VS Code Marketplace: `https://marketplace.visualstudio.com/items?itemName=publisher.name`
+  - OpenVSX: `https://open-vsx.org/extension/publisher/name[/version]`
+  - Direct VSIX download: `https://example.com/path/foo.vsix` (HTTPS only)
+- **`scanners/lib/vsix-fetch.mjs`** — HTTPS-only fetcher with 50 MB compressed cap, 30 s total timeout, SHA-256 streamed during download, manual redirect handling with per-source host whitelist (Marketplace gallerycdn, OpenVSX blob storage). No npm dependencies — uses Node 18+ `fetch`
+- **`scanners/lib/zip-extract.mjs`** — Zero-dependency ZIP parser + safe extractor. Rejects: zip-slip via `..` paths, POSIX absolute paths, Windows drive letters, NUL bytes, encrypted entries, ZIP64, multi-disk archives, unsupported compression methods, symlink entries (Unix `0xA000` mode bits in `external_attr`). Caps: 10 000 entries, 500 MB uncompressed total, 100× expansion ratio (sum-uncomp / sum-comp), depth 20. STORE + DEFLATE only
+- **Envelope `meta.source`** — When invoked with a URL, the scan envelope's `meta.source` field carries `{ type: "url", kind, url, finalUrl, sha256, size, publisher, name, version, requestedUrl }` so reports can attribute findings to the upstream artifact
+- **`knowledge/marketplace-api-notes.md`** — Reference notes for the (undocumented but stable) Marketplace direct-download endpoint and the (officially documented) OpenVSX endpoints used by `vsix-fetch.mjs`
+- **48 new tests** across `tests/scanners/zip-extract.test.mjs` (validateEntryName / isSymlink / extractToDir happy + adversarial), `tests/scanners/vsix-fetch.test.mjs` (detectUrlType / isAllowedHost / readBodyCapped), `tests/scanners/ide-extension-url.test.mjs` (URL flow integration with `global.fetch` mock — Marketplace, OpenVSX, direct VSIX, malformed VSIX, zip-slip VSIX, network failure, unsupported URL, GitHub URL). 1344 tests total (was 1296). Test helper: `tests/lib/build-zip.mjs` builds adversarial ZIPs that real `zip` tools refuse to emit
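The 100× expansion-ratio cap can be sketched roughly as follows. Hypothetical helper name — the real check in `scanners/lib/zip-extract.mjs` may differ — but per the note above, the ratio is computed over archive totals, not per entry:

```javascript
// Hypothetical sketch of the 100x expansion-ratio cap. The real check in
// scanners/lib/zip-extract.mjs may differ. The ratio is computed over archive
// totals (sum of uncompressed sizes / sum of compressed sizes), not per entry.
function withinExpansionRatio(entries, maxRatio = 100) {
  const compressed = entries.reduce((sum, e) => sum + e.compressedSize, 0);
  const uncompressed = entries.reduce((sum, e) => sum + e.uncompressedSize, 0);
  if (compressed === 0) return uncompressed === 0; // empty-archive edge case
  return uncompressed / compressed <= maxRatio;
}
```

Summing across the whole archive catches bombs that spread payload over many individually innocuous entries.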
+
+### Changed
+- `scanners/ide-extension-scanner.mjs` early-detects URL targets and routes through fetch + extract → temp dir → existing single-target scan path. Temp directory cleaned in `try/finally` regardless of success/error/abort
+- CLI help text in `bin/llm-security.mjs` and `commands/ide-scan.md` updated with URL examples and security model
+- Version bump: 6.3.0 → 6.4.0 across all files
+
+### Not supported (intentional)
+- GitHub repo URLs — would require `npm install` + `vsce package` build step. Use the Marketplace, OpenVSX, or a direct `.vsix` URL instead
+- VSIX `.signature.p7s` verification — deferred to v6.5.0 (requires X.509 / PKCS#7 parsing)
+- ZIP64 archives — real-world VSIX never approaches the 4 GB threshold

 ## [6.3.0] - 2026-04-17

 ### Added
@@ -1,6 +1,6 @@
-# LLM Security Plugin (v6.3.0)
+# LLM Security Plugin (v6.4.0)

-Security scanning, auditing, and threat modeling for Claude Code projects. 5 frameworks: OWASP LLM Top 10, Agentic AI Top 10 (ASI), Skills Top 10 (AST), MCP Top 10, AI Agent Traps (DeepMind). 1296 tests.
+Security scanning, auditing, and threat modeling for Claude Code projects. 5 frameworks: OWASP LLM Top 10, Agentic AI Top 10 (ASI), Skills Top 10 (AST), MCP Top 10, AI Agent Traps (DeepMind). 1344 tests.

 ## Commands
@@ -13,7 +13,7 @@ Security scanning, auditing, and threat modeling for Claude Code projects. 5 fra
 | `/security plugin-audit [path\|url]` | Plugin trust assessment (local or GitHub URL) |
 | `/security mcp-audit [--live]` | MCP server config audit (add `--live` for runtime inspection) |
 | `/security mcp-inspect` | Live MCP server inspection — connect via JSON-RPC 2.0, scan tool descriptions |
-| `/security ide-scan [target]` | Scan installed VS Code / JetBrains extensions — typosquat, theme-with-code, sideload, broad activation, uninstall hooks. Orchestrates reused scanners (UNI/ENT/NET/TNT/MEM/SCR) per extension. Offline by default, `--online` opt-in |
+| `/security ide-scan [target\|url]` | Scan installed VS Code / JetBrains extensions — OR fetch a remote VSIX from Marketplace, OpenVSX, or direct URL (v6.4.0). Typosquat, theme-with-code, sideload, broad activation, uninstall hooks. Hardened ZIP extractor (zip-slip, symlink, bomb, ratio caps). Orchestrates reused scanners (UNI/ENT/NET/TNT/MEM/SCR) per extension. Offline by default, `--online` opt-in |
 | `/security posture` | Quick scorecard (13 categories) |
 | `/security threat-model` | Interactive STRIDE/MAESTRO session |
 | `/security diff [path]` | Compare scan against baseline — shows new/resolved/unchanged/moved |
@@ -94,7 +94,11 @@ Scanner prefix: MCI. OWASP: MCP03, MCP06, MCP09. Invoked by `mcp-inspect` and `m

 `attack-simulator.mjs` — red-team harness. Data-driven: 64 scenarios in 12 categories from `knowledge/attack-scenarios.json`. Payloads constructed at runtime (fragment assembly to avoid triggering hooks on source). Uses `runHook()` from test helper. Adaptive mode (`--adaptive`): 5 mutation rounds per passing scenario (homoglyph, encoding, zero-width, case alternation, synonym). Mutation rules in `knowledge/attack-mutations.json`. Benchmark mode (`--benchmark`): outputs structured pass/fail metrics. Run: `node scanners/attack-simulator.mjs [--category <name>] [--json] [--verbose] [--adaptive] [--benchmark]`

 `ai-bom-generator.mjs` — AI Bill of Materials generator. Discovers AI components (models, MCP servers, plugins, knowledge, hooks) and outputs CycloneDX 1.6 JSON. Scanner prefix: BOM. Run: `node scanners/ai-bom-generator.mjs <target> [--output-file <path>]`

-`ide-extension-scanner.mjs` — scans installed VS Code (and forks: Cursor, Windsurf, VSCodium, code-server, Insiders, Remote-SSH) extensions. OS-aware discovery of `~/.vscode/extensions/` etc. via `lib/ide-extension-discovery.mjs`. Parses each `package.json` via `lib/ide-extension-parser.mjs`. 7 IDE-specific checks: blocklist match, theme-with-code, sideload (vsix), broad activation (`*` / `onStartupFinished`), typosquat (Levenshtein ≤2 vs top-100), extension-pack expansion, dangerous `vscode:uninstall` hooks. Then orchestrates reused scanners (UNI/ENT/NET/TNT/MEM/SCR) per extension with bounded concurrency (default 4). Scanner prefix: IDE. OWASP: LLM01, LLM02, LLM03, LLM06, ASI02, ASI04. Offline by default, `--online` opt-in for Marketplace/OSV.dev lookups. Knowledge: `knowledge/top-vscode-extensions.json` (typosquat seed + blocklist), `knowledge/ide-extension-threat-patterns.md`. JetBrains discovery is a v1.1 stub. Run: `node scanners/ide-extension-scanner.mjs [target] [--vscode-only] [--intellij-only] [--include-builtin] [--online] [--format json|compact] [--fail-on <sev>] [--output-file <path>]`. Invoked by `/security ide-scan`.
+`ide-extension-scanner.mjs` — scans installed VS Code (and forks: Cursor, Windsurf, VSCodium, code-server, Insiders, Remote-SSH) extensions. OS-aware discovery of `~/.vscode/extensions/` etc. via `lib/ide-extension-discovery.mjs`. Parses each `package.json` via `lib/ide-extension-parser.mjs`. 7 IDE-specific checks: blocklist match, theme-with-code, sideload (vsix), broad activation (`*` / `onStartupFinished`), typosquat (Levenshtein ≤2 vs top-100), extension-pack expansion, dangerous `vscode:uninstall` hooks. Then orchestrates reused scanners (UNI/ENT/NET/TNT/MEM/SCR) per extension with bounded concurrency (default 4). Scanner prefix: IDE. OWASP: LLM01, LLM02, LLM03, LLM06, ASI02, ASI04. Offline by default, `--online` opt-in for Marketplace/OSV.dev lookups. Knowledge: `knowledge/top-vscode-extensions.json` (typosquat seed + blocklist), `knowledge/ide-extension-threat-patterns.md`, `knowledge/marketplace-api-notes.md`. JetBrains discovery is a v1.1 stub.
+
+**v6.4.0 — URL support.** Targets can be Marketplace, OpenVSX, or direct `.vsix` URLs. Pipeline: `lib/vsix-fetch.mjs` (HTTPS-only fetch with 50MB cap, 30s timeout, SHA-256, manual redirect host whitelist) → `lib/zip-extract.mjs` (zero-dep ZIP parser, rejects zip-slip/symlink/absolute/drive-letter/encrypted/ZIP64, caps: 10 000 entries, 500MB uncomp, 100x ratio, depth 20) → existing scan pipeline against extracted `extension/` subdir → temp dir always cleaned in `try/finally`. Envelope.meta.source = `{ type: "url", kind, url, finalUrl, sha256, size, publisher?, name?, version? }`.
+
+Run: `node scanners/ide-extension-scanner.mjs [target|url] [--vscode-only] [--intellij-only] [--include-builtin] [--online] [--format json|compact] [--fail-on <sev>] [--output-file <path>]`. Invoked by `/security ide-scan`.

 ## Token Budget (ENFORCED)
@@ -119,7 +123,7 @@ Pipeline templates in `ci/`: `github-action.yml`, `azure-pipelines.yml`, `gitlab
 All templates use `--fail-on high --format sarif --output-file results.sarif` with SARIF upload per platform.
 Standalone CLI makes zero network calls (except opt-in OSV.dev in supply-chain-recheck). Fully Schrems II compatible.

-## Knowledge Files (18)
+## Knowledge Files (19)

 | File | Content |
 |------|---------|
@@ -141,6 +145,7 @@ Standalone CLI makes zero network calls (except opt-in OSV.dev in supply-chain-r
 | `ide-extension-threat-patterns.md` | 10 IDE-extension detection categories (VS Code + JetBrains) with 2024-2026 case studies |
 | `top-vscode-extensions.json` | Top ~100 VS Code Marketplace extension IDs (typosquat seed) + blocklist entries |
 | `top-jetbrains-plugins.json` | JetBrains plugin seed data (v1.1 stub — deferred) |
+| `marketplace-api-notes.md` | VS Code Marketplace + OpenVSX API endpoints used by `lib/vsix-fetch.mjs` (v6.4.0) |

 ## Reports
@@ -4,12 +4,12 @@

 *Built for my own Claude Code workflow and shared openly for anyone who finds it useful. This is a solo project — bug reports and feature requests are welcome, but pull requests are not accepted.*

 
 
 
 
 
 
 
 
 

 A Claude Code plugin that provides security scanning, auditing, and threat modeling for agentic AI projects. Built on [OWASP LLM Top 10 (2025)](https://genai.owasp.org/llm-top-10/), [OWASP Agentic AI Top 10](https://genai.owasp.org/agentic-ai/), and the [AI Agent Traps](https://papers.ssrn.com/sol3/papers.cfm?abstract_id=6372438) taxonomy (Google DeepMind, 2025), with threat intelligence from ToxicSkills, ClawHavoc, MCPTox, Pillar Security, Invariant Labs, and Operant AI research.
@@ -165,7 +165,7 @@ Or enable directly in `~/.claude/settings.json`:
 | `/security plugin-audit [path\|url]` | Dedicated plugin security audit with Install/Review/Do Not Install verdict (local or GitHub URL) |
 | `/security mcp-audit [--live]` | Focused audit of all installed MCP server configurations (add `--live` for runtime inspection) |
 | `/security mcp-inspect` | Connect to running MCP stdio servers and scan live tool descriptions |
-| `/security ide-scan [target]` | Scan installed VS Code (+ Cursor, Windsurf, VSCodium, code-server) / JetBrains extensions — typosquat, theme-with-code, sideload, broad activation, uninstall hooks, plus UNI/ENT/NET/TNT/MEM/SCR per extension. Offline by default |
+| `/security ide-scan [target\|url]` | Scan installed VS Code (+ Cursor, Windsurf, VSCodium, code-server) / JetBrains extensions — OR fetch a remote VSIX from VS Code Marketplace, OpenVSX, or direct `.vsix` URL (v6.4.0). Typosquat, theme-with-code, sideload, broad activation, uninstall hooks, plus UNI/ENT/NET/TNT/MEM/SCR per extension. Offline by default |
 | `/security posture` | Quick security posture scorecard (16 categories incl. compliance) |
 | `/security diff [path]` | Compare scan against stored baseline — shows new/resolved/unchanged/moved findings |
 | `/security watch [path] [--interval 6h]` | Continuous monitoring — runs diff on a recurring interval via /loop |
@@ -822,6 +822,7 @@ This plugin provides full-stack security hardening (static analysis + supply cha

 | Version | Date | Highlights |
 |---------|------|------------|
+| **6.4.0** | 2026-04-17 | **`/security ide-scan <url>` — pre-install verification.** The IDE extension scanner now accepts URLs and fetches the VSIX before scanning. Supported: VS Code Marketplace (`https://marketplace.visualstudio.com/items?itemName=publisher.name`), OpenVSX (`https://open-vsx.org/extension/publisher/name[/version]`), and direct `.vsix` URLs. New libraries: `lib/vsix-fetch.mjs` (HTTPS-only fetch with 50MB cap, 30s timeout, SHA-256, manual host-whitelisted redirects) and `lib/zip-extract.mjs` (zero-dep ZIP parser, rejects zip-slip / symlinks / absolute paths / drive letters / encrypted entries / ZIP64; caps: 10 000 entries, 500MB uncompressed, 100x expansion ratio, depth 20). Temp dir always cleaned in `try/finally`. Envelope `meta.source` carries `{ type: "url", kind, url, finalUrl, sha256, size, publisher, name, version }`. New knowledge file: `marketplace-api-notes.md`. GitHub repo URLs intentionally not supported (would require a build step). 1344 tests (was 1296). |
 | **6.3.0** | 2026-04-17 | **IDE extension prescan.** New `/security ide-scan` command and `ide-extension-scanner.mjs` (prefix IDE) discover and audit installed VS Code extensions (and forks: Cursor, Windsurf, VSCodium, code-server, Insiders, Remote-SSH; JetBrains is a v1.1 stub). 7 IDE-specific checks: blocklist match, theme-with-code, sideload (`.vsix`), broad activation (`*`, `onStartupFinished`), Levenshtein typosquat ≤2 vs top-100, extension-pack expansion, dangerous `vscode:uninstall` hooks. Per-extension orchestration of UNI/ENT/NET/TNT/MEM/SCR scanners with bounded concurrency. OS-aware discovery via `lib/ide-extension-discovery.mjs` (platform-specific suffix parsing for `darwin-x64`, `linux-arm64`, etc.). Offline-first; `--online` opt-in for future Marketplace/OSV.dev lookups. New knowledge files: `ide-extension-threat-patterns.md` (10 categories, 2024-2026 case studies from Koi Security — GlassWorm, WhiteCobra, TigerJack, Material Theme), `top-vscode-extensions.json` (typosquat seed + blocklist), `top-jetbrains-plugins.json` (stub). 1296 tests (was 1274). |
 | **6.2.0** | 2026-04-17 | **Opus 4.7 + Claude Code 2.1.112 alignment.** Bash-normalize extended with T5 (`${IFS}` word-splitting) and T6 (ANSI-C `$'\xHH'` hex quoting) layers. New `pre-compact-scan.mjs` PreCompact hook — scans transcript tail (500 KB cap, <500 ms) for injection + credentials before context compaction. Modes: `block` / `warn` / `off` via `LLM_SECURITY_PRECOMPACT_MODE`. Agent files reframed for Opus 4.7's more literal instruction-following (Step 0 generalization boundary + parallel Read-hint in skill-scanner + mcp-scanner). New `docs/security-hardening-guide.md` with env-var reference, sandboxing notes, system-card §5.2.1 / §6.3.1.1 mapping. CLAUDE.md Defense Philosophy links to system card. 1274 tests (was 1264). |
 | **6.1.0** | 2026-04-10 | **CI/CD integration.** `--fail-on <severity>` flag for threshold-based exit codes (exit 1 if findings at/above level). `--compact` output mode (one-liner per finding). Policy `ci` section in `policy.json`. Pipeline templates: GitHub Actions, Azure DevOps, GitLab CI with SARIF upload. CI/CD guide (`docs/ci-cd-guide.md`) with Schrems II/NSM compliance docs. npm publish preparation (`files` whitelist). 1264 tests. |
@@ -26,9 +26,12 @@ Commands:
     Quick security posture assessment (16 categories)
   audit-bom <target> [--output-file <path>]
     Generate AI Bill of Materials (CycloneDX 1.6)
-  ide-scan [target] [--vscode-only] [--intellij-only] [--include-builtin]
+  ide-scan [target|url] [--vscode-only] [--intellij-only] [--include-builtin]
+           [--online] [--format compact|json] [--fail-on <severity>]
-    Scan installed VS Code / JetBrains extensions (offline by default)
+    Scan installed VS Code / JetBrains extensions, OR fetch a remote VSIX:
+    - https://marketplace.visualstudio.com/items?itemName=publisher.name
+    - https://open-vsx.org/extension/publisher/name[/version]
+    - https://example.com/foo.vsix (direct .vsix download)
   benchmark [--adaptive] [--category <name>]
     Run attack simulation benchmark
@@ -20,13 +20,26 @@ node <this plugin's scanners/ide-extension-scanner.mjs> [target]
 ```

 Arguments (pass through as provided by the user):
-- `[target]` — omit, `.`, or `all` to discover all installed extensions. Absolute path to an extracted extension directory for single-scan mode.
+- `[target]` — one of:
+  - omit, `.`, or `all` → discover all installed extensions
+  - absolute path to an extracted extension directory → single-scan mode
+  - `https://marketplace.visualstudio.com/items?itemName=<publisher>.<name>` → fetch from VS Code Marketplace
+  - `https://open-vsx.org/extension/<publisher>/<name>[/<version>]` → fetch from OpenVSX
+  - `https://example.com/path/foo.vsix` → direct VSIX download (HTTPS only)
+  - GitHub repo URLs are NOT supported in v6.4.0 (would require build step)
 - `--vscode-only` / `--intellij-only` — restrict discovery
 - `--include-builtin` — include Microsoft builtin extensions (default: excluded)
 - `--online` — enable Marketplace/OSV.dev lookups (opt-in; default: fully offline)
 - `--format compact|json` — output format
 - `--fail-on <severity>` — exit 1 if findings at/above severity

+URL mode notes:
+- Hardened ZIP extractor with caps: 50MB compressed, 500MB uncompressed, 100x expansion ratio, 10 000 entries, depth 20.
+- Rejects: zip-slip paths, symlink entries, absolute paths, drive letters, encrypted entries, ZIP64.
+- TLS verified, HTTPS only, 30s timeout. Cross-host redirects rejected.
+- Temp directory always cleaned up (success, error, abort).
+- `meta.source` in the envelope contains `{ type: "url", kind, url, finalUrl, sha256, size, publisher, name, version }`.
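The entry-name rejections above can be sketched as a boolean check. This is a hypothetical helper — the real `validateEntryName` in `lib/zip-extract.mjs` implements more checks and richer error reporting:

```javascript
// Hypothetical boolean variant of the entry-name hardening checks.
// The real validateEntryName in lib/zip-extract.mjs may differ.
function isSafeEntryName(name, maxDepth = 20) {
  if (typeof name !== 'string' || name.length === 0) return false;
  if (name.includes('\0')) return false;                           // NUL byte
  if (name.startsWith('/') || name.startsWith('\\')) return false; // absolute path
  if (/^[A-Za-z]:/.test(name)) return false;                       // Windows drive letter
  const parts = name.split(/[\\/]/);
  if (parts.includes('..')) return false;                          // zip-slip traversal
  if (parts.length > maxDepth) return false;                       // depth cap
  return true;
}
```

Rejecting `..` as a whole path segment (rather than substring-matching) avoids false positives on legitimate names like `a..b.txt` while still blocking traversal.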
 Parse the JSON output. The result contains:
 - `meta.scanner`, `meta.version`, `meta.target`, `meta.extensions_discovered` (per type), `meta.roots_scanned`, `meta.warnings`
 - `extensions[]` — per-extension results with `id`, `version`, `type`, `publisher`, `source`, `is_builtin`, `signed`, `scanner_results` (IDE/UNI/ENT/NET/TNT/MEM/SCR), `aggregate` (counts, risk_score, risk_band, verdict), `warnings`
84 plugins/llm-security/knowledge/marketplace-api-notes.md (new file)
@@ -0,0 +1,84 @@
# VS Code Marketplace + OpenVSX API notes

Reference notes for `scanners/lib/vsix-fetch.mjs`. These endpoints are used to download VSIX packages for `/security ide-scan <url>` (v6.4.0).

## VS Code Marketplace

**Status:** Undocumented but stable. Used by the `vsce` CLI and by VS Code itself.

### Direct VSIX download (the URL we use)

```
https://{publisher}.gallery.vsassets.io/_apis/public/gallery/publisher/{publisher}/extension/{name}/latest/assetbyname/Microsoft.VisualStudio.Services.VSIXPackage
```

- `{publisher}` and `{name}` come from the `itemName=publisher.name` query parameter on `https://marketplace.visualstudio.com/items`.
- `latest` resolves to the most recent stable version. Specific versions can be requested by replacing `latest` with `<version>`.
- The response is a ZIP (VSIX) with `Content-Type: application/octet-stream`.
- May redirect to `*.gallerycdn.vsassets.io`. Our fetcher allows redirects only to that host family, never to arbitrary hosts.
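That per-source redirect policy can be sketched as follows. The helper name and exact patterns are hypothetical — the real logic lives in `scanners/lib/vsix-fetch.mjs` and may differ:

```javascript
// Hypothetical sketch of per-source redirect host allow-listing.
// The real check in scanners/lib/vsix-fetch.mjs may differ.
const ALLOWED_REDIRECT_HOSTS = {
  marketplace: [/\.gallery\.vsassets\.io$/, /\.gallerycdn\.vsassets\.io$/],
  openvsx: [/^open-vsx\.org$/, /^openvsxorg\.blob\.core\.windows\.net$/],
};

function isAllowedRedirect(kind, redirectUrl) {
  let url;
  try {
    url = new URL(redirectUrl);
  } catch {
    return false; // unparsable Location header
  }
  if (url.protocol !== 'https:') return false; // never downgrade to plain HTTP
  const patterns = ALLOWED_REDIRECT_HOSTS[kind] || [];
  return patterns.some((re) => re.test(url.hostname));
}
```

Anchoring the patterns with `$` matters: a suffix match without it would accept `gallerycdn.vsassets.io.attacker.example`.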
### `extensionquery` (not used here, listed for completeness)

```
POST https://marketplace.visualstudio.com/_apis/public/gallery/extensionquery
Headers:
  Accept: application/json;api-version=3.0-preview.1
  Content-Type: application/json
Body:
  { "filters": [{ "criteria": [{ "filterType": 7, "value": "publisher.name" }] }],
    "flags": 914 }
```

This returns metadata (versions, publisher info, statistics) but is heavier than the direct download, and parsing the response shape is brittle. We keep the direct download path for v6.4.0.

### Stability risk

Microsoft has changed Marketplace APIs in the past without warning. Mitigation:

- Fall back to OpenVSX when both options exist (most extensions are mirrored).
- Document the endpoint here so that breakage can be diagnosed quickly.
- All callers receive a single `Error` with a descriptive message — no stack traces leak through to the scanner envelope.

## OpenVSX (Eclipse Foundation)

**Status:** Officially documented at https://open-vsx.org/swagger-ui.

### Resolve "latest" version

```
GET https://open-vsx.org/api/{publisher}/{name}/latest
```

Returns JSON. We extract `.version` and use it for the next request.

### Direct VSIX download

```
GET https://open-vsx.org/api/{publisher}/{name}/{version}/file/{publisher}.{name}-{version}.vsix
```

Returns the raw VSIX. May redirect to `openvsxorg.blob.core.windows.net`.

## Caps & defenses (shared by all sources)

- TLS verification enabled (no `--insecure` opt-in).
- HTTPS only. Plain HTTP is rejected at `detectUrlType` and at fetch time.
- Manual redirect handling. Allowed hosts whitelisted per source type.
- 30-second total timeout via `AbortController`.
- 50MB compressed VSIX cap. Streaming reader aborts when cap exceeded.
- SHA-256 computed during streaming for `meta.source.sha256`.
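The last two defenses work together: bytes are counted against the cap and fed to the hash per chunk, so an oversized body is aborted without ever buffering it whole. A hypothetical sketch — the real `readBodyCapped` in `lib/vsix-fetch.mjs` may differ:

```javascript
// Hypothetical sketch; the real readBodyCapped in lib/vsix-fetch.mjs may differ.
// Works on any async-iterable of Buffer chunks (Node response bodies qualify).
async function readCappedWithSha256(chunks, cap = 50 * 1024 * 1024) {
  const { createHash } = await import('node:crypto'); // dynamic import works from CJS and ESM
  const hash = createHash('sha256');
  const parts = [];
  let size = 0;
  for await (const chunk of chunks) {
    size += chunk.length;
    if (size > cap) throw new Error(`VSIX exceeds ${cap}-byte compressed cap`);
    hash.update(chunk);      // hash streamed per chunk, never re-read
    parts.push(chunk);
  }
  return { buffer: Buffer.concat(parts), sha256: hash.digest('hex'), size };
}
```

Checking the running total before buffering each chunk means a hostile server cannot force more than one chunk past the cap into memory.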
## What is NOT supported (v6.4.0)

- GitHub repo URLs — would need `npm install` + `vsce package` build step.
- VS Code `code:` protocol URIs.
- VSIX signature verification (`.signature.p7s`). Deferred to v6.5.0.
- ZIP64 archives. Real VSIX never approaches the 4 GB threshold.
- Encrypted ZIP entries (general-purpose flag bit 0).
@@ -1,6 +1,6 @@
 {
   "name": "llm-security",
-  "version": "6.3.0",
+  "version": "6.4.0",
   "description": "Security scanning, auditing, and threat modeling for Claude Code projects",
   "type": "module",
   "bin": {
@@ -13,7 +13,9 @@
 // Library: import { scan, discoverAll } from './ide-extension-scanner.mjs'

 import { resolve, join, relative } from 'node:path';
-import { writeFileSync } from 'node:fs';
+import { writeFileSync, existsSync } from 'node:fs';
+import { mkdtemp, rm, stat } from 'node:fs/promises';
+import { tmpdir } from 'node:os';
 import { fileURLToPath } from 'node:url';
 import { discoverFiles } from './lib/file-discovery.mjs';
 import { finding, scannerResult } from './lib/output.mjs';
@@ -25,6 +27,8 @@ import {
 } from './lib/ide-extension-discovery.mjs';
 import { parseVSCodeExtension, parseVsixFile } from './lib/ide-extension-parser.mjs';
 import { loadTopVSCode, loadVSCodeBlocklist, normalizeId } from './lib/ide-extension-data.mjs';
+import { fetchVsixFromUrl, detectUrlType } from './lib/vsix-fetch.mjs';
+import { extractToDir, ZipError } from './lib/zip-extract.mjs';

 import { scan as scanUnicode } from './unicode-scanner.mjs';
 import { scan as scanEntropy } from './entropy-scanner.mjs';
@@ -33,9 +37,66 @@ import { scan as scanTaint } from './taint-tracer.mjs';
 import { scan as scanMemoryPoisoning } from './memory-poisoning-scanner.mjs';
 import { scan as scanSupplyChain } from './supply-chain-recheck.mjs';

-const VERSION = '6.3.0';
+const VERSION = '6.4.0';
 const SCANNER = 'IDE';

+// ---------------------------------------------------------------------------
+// URL → temp dir orchestration
+// ---------------------------------------------------------------------------
+
+function isUrlTarget(target) {
+  return typeof target === 'string' && /^https?:\/\//i.test(target);
+}
+
+/**
+ * Fetch a VSIX from a URL, extract it to a temp dir, and return the path that
+ * `parseVSCodeExtension` should be pointed at. VSIX layout normally nests the
+ * extension under `extension/`.
+ *
+ * Caller MUST `await rm(result.tempDir, { recursive: true, force: true })` in finally.
+ *
+ * @param {string} url
+ * @returns {Promise<{ extRoot: string, tempDir: string, source: object }>}
+ */
+async function fetchAndExtractVsixUrl(url) {
+  const tempDir = await mkdtemp(join(tmpdir(), 'llm-sec-vsix-'));
+  try {
+    let fetched;
+    try {
+      fetched = await fetchVsixFromUrl(url);
+    } catch (err) {
+      throw new Error(`fetch failed: ${err.message}`);
+    }
+    try {
+      await extractToDir(fetched.buffer, tempDir);
+    } catch (err) {
+      if (err instanceof ZipError) {
+        throw new Error(`malformed VSIX (${err.code}): ${err.message}`);
+      }
+      throw err;
+    }
+    // VSIX nests files under `extension/`. If that doesn't exist, fall back to
+    // the temp dir itself (some packagers omit the wrapper).
+    const nested = join(tempDir, 'extension');
+    const extRoot = existsSync(nested) ? nested : tempDir;
+    const { type: kind, ...sourceMeta } = fetched.source;
+    const source = {
+      type: 'url',
+      kind, // 'marketplace' | 'openvsx' | 'vsix'
+      url,
+      finalUrl: fetched.finalUrl,
+      sha256: fetched.sha256,
+      size: fetched.size,
+      ...sourceMeta,
+    };
+    return { extRoot, tempDir, source };
+  } catch (err) {
+    // Cleanup on error before propagating.
+    await rm(tempDir, { recursive: true, force: true }).catch(() => {});
+    throw err;
+  }
+}
+
 // ---------------------------------------------------------------------------
 // IDE-specific checks (operate on parsed manifest)
 // ---------------------------------------------------------------------------
@@ -386,10 +447,38 @@ export async function scan(target, options = {}) {
   const warnings = [];
   let extensions = [];
   let rootsScanned = [];
+  let urlSource = null;
+  let urlTempDir = null;

-  const singleTargetPath = target && target !== '.' && target !== 'all' ? resolve(target) : null;
+  // URL mode: fetch VSIX, extract to temp dir, then treat extracted dir as single target.
+  if (isUrlTarget(target)) {
+    const detected = detectUrlType(target);
+    if (detected.type === 'unknown') {
+      warnings.push(`unsupported URL: ${target} (expected VS Code Marketplace, OpenVSX, or direct .vsix)`);
+    } else if (detected.type === 'github') {
+      warnings.push('GitHub repo URLs are not supported in v6.4.0 — would require build step. Use the Marketplace, OpenVSX, or a direct .vsix link.');
+    } else {
+      try {
+        const fetched = await fetchAndExtractVsixUrl(target);
+        urlSource = fetched.source;
+        urlTempDir = fetched.tempDir;
+        target = fetched.extRoot; // forward into single-target path mode
+      } catch (err) {
+        warnings.push(`URL fetch/extract failed: ${err.message}`);
+      }
+    }
+  }

-  if (singleTargetPath) {
+  const urlFetchFailed = isUrlTarget(target) && !urlSource;
+  const singleTargetPath = target && target !== '.' && target !== 'all' && !isUrlTarget(target)
+    ? resolve(target)
+    : null;
+
+  try {
+  if (urlFetchFailed) {
+    // Don't fall through to discovery when the user asked for a specific URL.
+  } else if (singleTargetPath) {
     // Single-directory mode
     const parsed = await parseVSCodeExtension(singleTargetPath);
     if (!parsed) {
@ -453,7 +542,7 @@ export async function scan(target, options = {}) {
|
|||
meta: {
|
||||
scanner: 'ide-extension-scanner',
|
||||
version: VERSION,
|
||||
target: singleTargetPath || (target || 'discover-all'),
|
||||
target: urlSource ? urlSource.url : (singleTargetPath || (target || 'discover-all')),
|
||||
timestamp: new Date().toISOString(),
|
||||
node_version: process.version,
|
||||
duration_ms: Date.now() - started,
|
||||
|
|
@ -463,6 +552,7 @@ export async function scan(target, options = {}) {
|
|||
},
|
||||
roots_scanned: rootsScanned,
|
||||
online: options.online === true,
|
||||
source: urlSource,
|
||||
warnings,
|
||||
},
|
||||
extensions: perExt,
|
||||
|
|
@ -476,6 +566,11 @@ export async function scan(target, options = {}) {
|
|||
extensions_warning: warningCount,
|
||||
},
|
||||
};
|
||||
} finally {
|
||||
if (urlTempDir) {
|
||||
await rm(urlTempDir, { recursive: true, force: true }).catch(() => {});
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
|
|
@ -535,7 +630,10 @@ async function main() {
|
|||
console.log(`ide-extension-scanner v${VERSION}
|
||||
Usage: node ide-extension-scanner.mjs [target] [options]
|
||||
|
||||
target: omitted/"."/"all" = discover all installed; path to extracted extension directory = single scan
|
||||
target: omitted/"."/"all" = discover all installed; path to extracted extension directory = single scan;
|
||||
https://marketplace.visualstudio.com/items?itemName=publisher.name = fetch from Marketplace;
|
||||
https://open-vsx.org/extension/publisher/name[/version] = fetch from OpenVSX;
|
||||
https://example.com/path/foo.vsix = direct VSIX download
|
||||
|
||||
Options:
|
||||
--vscode-only Skip JetBrains discovery
|
||||
|
|
|
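For orientation, the `source` object wired into `meta` above ends up shaped roughly like this for a Marketplace fetch. A minimal sketch; every value below is an illustrative placeholder, not real scan output:

```javascript
// Illustrative shape of meta.source after a Marketplace fetch.
// All values are placeholders — the real object is assembled in
// fetchAndExtractVsixUrl from the fetcher's return value.
const exampleSource = {
  type: 'url',
  kind: 'marketplace', // 'marketplace' | 'openvsx' | 'vsix'
  url: 'https://marketplace.visualstudio.com/items?itemName=example.ext',
  finalUrl: 'https://example.gallery.vsassets.io/_apis/public/gallery/placeholder',
  sha256: '0'.repeat(64), // hex digest streamed during fetch
  size: 123456,
  publisher: 'example',
  name: 'ext',
  requestedUrl: 'https://example.gallery.vsassets.io/_apis/public/gallery/placeholder',
};

console.log(Object.keys(exampleSource).length); // 9
```

The `type: 'marketplace'` returned by the fetcher is renamed to `kind` so that `type: 'url'` can mark the overall target mode.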
285
plugins/llm-security/scanners/lib/vsix-fetch.mjs
Normal file

@@ -0,0 +1,285 @@
// vsix-fetch.mjs — Fetch VSIX packages from VS Code Marketplace, OpenVSX, or direct URL.
// Zero dependencies. Streams to memory with strict size cap, computes SHA-256 on the fly.
//
// Defenses:
// - HTTPS only (no plain HTTP, no protocol downgrade on redirects)
// - 30s total timeout (network + body)
// - 50MB max compressed size (abort streaming when exceeded)
// - TLS verification always enabled
// - No follow on cross-origin redirects (same registered host only)
// - Marketplace endpoint is undocumented but stable; documented in
//   knowledge/marketplace-api-notes.md.

import { createHash } from 'node:crypto';

const MAX_VSIX_BYTES = 50 * 1024 * 1024; // 50MB
const FETCH_TIMEOUT_MS = 30_000;

const MARKETPLACE_HOSTS = new Set([
  'marketplace.visualstudio.com',
]);
const OPENVSX_HOSTS = new Set([
  'open-vsx.org',
]);

/**
 * Detect what kind of URL this is.
 * @param {string} url
 * @returns {{ type: 'marketplace'|'openvsx'|'vsix'|'github'|'unknown', publisher?: string, name?: string, version?: string }}
 */
export function detectUrlType(url) {
  let u;
  try { u = new URL(url); } catch { return { type: 'unknown' }; }
  if (u.protocol !== 'https:') return { type: 'unknown' };

  // VS Code Marketplace: items?itemName=publisher.name
  if (MARKETPLACE_HOSTS.has(u.hostname)) {
    const itemName = u.searchParams.get('itemName');
    if (!itemName || !itemName.includes('.')) return { type: 'unknown' };
    const dot = itemName.indexOf('.');
    const publisher = itemName.slice(0, dot);
    const name = itemName.slice(dot + 1);
    if (!publisher || !name) return { type: 'unknown' };
    return { type: 'marketplace', publisher, name };
  }

  // OpenVSX: /extension/{publisher}/{name}[/{version}]
  if (OPENVSX_HOSTS.has(u.hostname)) {
    const parts = u.pathname.split('/').filter(Boolean);
    if (parts[0] !== 'extension' || parts.length < 3) return { type: 'unknown' };
    const [, publisher, name, version] = parts;
    return { type: 'openvsx', publisher, name, version: version || null };
  }

  // GitHub repo (not supported in v6.4.0)
  if (u.hostname === 'github.com') {
    return { type: 'github' };
  }

  // Direct .vsix link
  if (u.pathname.toLowerCase().endsWith('.vsix')) {
    return { type: 'vsix' };
  }

  return { type: 'unknown' };
}

function isAllowedHost(hostname, originalType) {
  if (originalType === 'marketplace') {
    // Marketplace API redirects to vsassets cdn (vstmrblob).
    return MARKETPLACE_HOSTS.has(hostname)
      || hostname.endsWith('.gallerycdn.vsassets.io')
      || hostname.endsWith('.vsassets.io');
  }
  if (originalType === 'openvsx') {
    return OPENVSX_HOSTS.has(hostname)
      || hostname === 'openvsxorg.blob.core.windows.net'
      || hostname.endsWith('.openvsx.org');
  }
  // Direct vsix: only same host as the original URL (caller enforces).
  return true;
}

/**
 * Stream the body of a Response into a Buffer with size cap and SHA-256.
 * Aborts via the AbortController if cap is exceeded.
 * @param {Response} res
 * @param {AbortController} controller
 * @returns {Promise<{ buffer: Buffer, sha256: string, size: number }>}
 */
async function readBodyCapped(res, controller) {
  if (!res.body) throw new Error('response has no body');
  const hash = createHash('sha256');
  const chunks = [];
  let size = 0;
  const reader = res.body.getReader();
  // eslint-disable-next-line no-constant-condition
  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    if (!value) continue;
    size += value.byteLength;
    if (size > MAX_VSIX_BYTES) {
      try { controller.abort(); } catch {}
      throw new Error(`VSIX exceeds maximum size (${MAX_VSIX_BYTES} bytes)`);
    }
    hash.update(value);
    chunks.push(Buffer.from(value));
  }
  return { buffer: Buffer.concat(chunks), sha256: hash.digest('hex'), size };
}

async function httpsFetch(url, init, originalType) {
  const u = new URL(url);
  if (u.protocol !== 'https:') {
    throw new Error(`refusing non-HTTPS URL: ${url}`);
  }
  if (!isAllowedHost(u.hostname, originalType)) {
    throw new Error(`refusing redirect to disallowed host: ${u.hostname}`);
  }
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), FETCH_TIMEOUT_MS);
  try {
    const res = await fetch(url, {
      ...init,
      signal: controller.signal,
      // Manual redirect handling so we can validate every hop.
      redirect: 'manual',
    });
    if (res.status >= 300 && res.status < 400) {
      const loc = res.headers.get('location');
      if (!loc) throw new Error(`HTTP ${res.status} without Location header`);
      const next = new URL(loc, url).toString();
      // Cap redirect depth via init counter.
      const depth = (init && init.__depth) || 0;
      if (depth >= 5) throw new Error('too many redirects');
      return httpsFetch(next, { ...init, __depth: depth + 1, method: 'GET', body: undefined }, originalType);
    }
    if (!res.ok) {
      throw new Error(`HTTP ${res.status} ${res.statusText} for ${url}`);
    }
    const out = await readBodyCapped(res, controller);
    return { ...out, finalUrl: url };
  } finally {
    clearTimeout(timer);
  }
}

/**
 * Fetch a VSIX from the VS Code Marketplace by publisher.name.
 * Downloads via the well-known direct asset URL on {publisher}.gallery.vsassets.io
 * (the same pattern `vsce` and `code` use). The undocumented gallery API
 * (POST https://marketplace.visualstudio.com/_apis/public/gallery/extensionquery)
 * is an alternative; both endpoints are described in knowledge/marketplace-api-notes.md.
 *
 * @param {string} publisher
 * @param {string} name
 * @returns {Promise<{ buffer: Buffer, sha256: string, size: number, finalUrl: string, source: object }>}
 */
export async function fetchMarketplaceVsix(publisher, name) {
  // Direct download URL pattern (well-known, used by `vsce` and `code` itself):
  // https://{publisher}.gallery.vsassets.io/_apis/public/gallery/publisher/{publisher}/extension/{name}/latest/assetbyname/Microsoft.VisualStudio.Services.VSIXPackage
  const directUrl =
    `https://${encodeURIComponent(publisher)}.gallery.vsassets.io` +
    `/_apis/public/gallery/publisher/${encodeURIComponent(publisher)}` +
    `/extension/${encodeURIComponent(name)}/latest/assetbyname/Microsoft.VisualStudio.Services.VSIXPackage`;

  const out = await httpsFetch(directUrl, { method: 'GET' }, 'marketplace');
  return {
    ...out,
    source: { type: 'marketplace', publisher, name, requestedUrl: directUrl },
  };
}

/**
 * Fetch a VSIX from OpenVSX. If version is omitted, hits the "latest" endpoint to resolve.
 * Direct file pattern:
 *   https://open-vsx.org/api/{pub}/{name}/{version}/file/{pub}.{name}-{version}.vsix
 * Without version we hit:
 *   https://open-vsx.org/api/{pub}/{name}/latest
 * to resolve, then download.
 *
 * @param {string} publisher
 * @param {string} name
 * @param {string|null} version
 */
export async function fetchOpenVsxVsix(publisher, name, version) {
  let resolvedVersion = version;
  if (!resolvedVersion) {
    const meta = await httpsFetch(
      `https://open-vsx.org/api/${encodeURIComponent(publisher)}/${encodeURIComponent(name)}/latest`,
      { method: 'GET', headers: { Accept: 'application/json' } },
      'openvsx',
    );
    let info;
    try { info = JSON.parse(meta.buffer.toString('utf8')); }
    catch { throw new Error('OpenVSX returned non-JSON metadata'); }
    if (!info || typeof info.version !== 'string') {
      throw new Error('OpenVSX metadata missing version');
    }
    resolvedVersion = info.version;
  }

  const url =
    `https://open-vsx.org/api/${encodeURIComponent(publisher)}/${encodeURIComponent(name)}` +
    `/${encodeURIComponent(resolvedVersion)}/file/` +
    `${encodeURIComponent(publisher)}.${encodeURIComponent(name)}-${encodeURIComponent(resolvedVersion)}.vsix`;

  const out = await httpsFetch(url, { method: 'GET' }, 'openvsx');
  return {
    ...out,
    source: { type: 'openvsx', publisher, name, version: resolvedVersion, requestedUrl: url },
  };
}

/**
 * Fetch a VSIX from a direct URL.
 * @param {string} url
 */
export async function fetchDirectVsix(url) {
  const u = new URL(url);
  if (u.protocol !== 'https:') {
    throw new Error('direct VSIX URL must be HTTPS');
  }
  // Track host so redirects must stay on the same registered host.
  const sourceHost = u.hostname;
  const out = await httpsFetchSameHost(url, sourceHost);
  return {
    ...out,
    source: { type: 'vsix', requestedUrl: url },
  };
}

async function httpsFetchSameHost(url, sourceHost, depth = 0) {
  const u = new URL(url);
  if (u.protocol !== 'https:') {
    throw new Error(`refusing non-HTTPS URL: ${url}`);
  }
  if (u.hostname !== sourceHost) {
    throw new Error(`refusing cross-host redirect: ${u.hostname} != ${sourceHost}`);
  }
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), FETCH_TIMEOUT_MS);
  try {
    const res = await fetch(url, { signal: controller.signal, redirect: 'manual' });
    if (res.status >= 300 && res.status < 400) {
      const loc = res.headers.get('location');
      if (!loc) throw new Error(`HTTP ${res.status} without Location header`);
      // Cap redirect depth, mirroring httpsFetch, so a redirect loop cannot recurse forever.
      if (depth >= 5) throw new Error('too many redirects');
      const next = new URL(loc, url).toString();
      return httpsFetchSameHost(next, sourceHost, depth + 1);
    }
    if (!res.ok) throw new Error(`HTTP ${res.status} ${res.statusText} for ${url}`);
    const out = await readBodyCapped(res, controller);
    return { ...out, finalUrl: url };
  } finally {
    clearTimeout(timer);
  }
}

/**
 * High-level dispatch. Detects URL type and returns a fetched VSIX.
 * @param {string} url
 * @returns {Promise<{ buffer: Buffer, sha256: string, size: number, finalUrl: string, source: object }>}
 */
export async function fetchVsixFromUrl(url) {
  const detected = detectUrlType(url);
  switch (detected.type) {
    case 'marketplace':
      return fetchMarketplaceVsix(detected.publisher, detected.name);
    case 'openvsx':
      return fetchOpenVsxVsix(detected.publisher, detected.name, detected.version);
    case 'vsix':
      return fetchDirectVsix(url);
    case 'github':
      throw new Error('GitHub repo URLs are not supported in v6.4.0 (would require build step). Use Marketplace, OpenVSX, or a direct .vsix URL.');
    default:
      throw new Error(`unsupported URL: ${url}`);
  }
}

export const __testing = {
  MAX_VSIX_BYTES,
  FETCH_TIMEOUT_MS,
  isAllowedHost,
  readBodyCapped,
};
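The dispatch rules above can be exercised in isolation. A minimal sketch that re-implements the same detection logic standalone (a simplified hypothetical re-implementation for illustration, not an import of `vsix-fetch.mjs`):

```javascript
// Simplified sketch of the URL-type dispatch described above — the real
// logic (with publisher/name extraction) lives in scanners/lib/vsix-fetch.mjs.
function detectUrlTypeSketch(url) {
  let u;
  try { u = new URL(url); } catch { return 'unknown'; }
  if (u.protocol !== 'https:') return 'unknown'; // HTTPS-only, like the scanner
  if (u.hostname === 'marketplace.visualstudio.com') {
    const item = u.searchParams.get('itemName');
    return item && item.includes('.') ? 'marketplace' : 'unknown';
  }
  if (u.hostname === 'open-vsx.org') {
    const parts = u.pathname.split('/').filter(Boolean);
    return parts[0] === 'extension' && parts.length >= 3 ? 'openvsx' : 'unknown';
  }
  if (u.hostname === 'github.com') return 'github';
  return u.pathname.toLowerCase().endsWith('.vsix') ? 'vsix' : 'unknown';
}

console.log(detectUrlTypeSketch('https://marketplace.visualstudio.com/items?itemName=ms-python.python')); // marketplace
console.log(detectUrlTypeSketch('http://open-vsx.org/extension/redhat/java')); // unknown (not HTTPS)
console.log(detectUrlTypeSketch('https://example.com/ext/foo.VSIX')); // vsix
```

Note the `.toLowerCase()` on the pathname: a `.VSIX` suffix in any case is accepted, matching the real detector.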
374
plugins/llm-security/scanners/lib/zip-extract.mjs
Normal file

@@ -0,0 +1,374 @@
// zip-extract.mjs — Zero-dependency ZIP parser and safe extractor for VSIX files.
//
// VSIX is a plain ZIP container. We implement the minimum subset needed to:
// - Parse the End of Central Directory (EOCD)
// - Walk Central Directory headers to enumerate entries
// - Read each Local File Header and inflate its data with node:zlib
// - Validate every entry name BEFORE creating any file (zip-slip, absolute, symlinks)
// - Enforce caps: total entries, total uncompressed bytes, expansion ratio, depth
//
// Compression methods supported:
//   0 STORE (no compression)
//   8 DEFLATE (raw deflate, via zlib.createInflateRaw)
// Anything else is rejected.
//
// Spec: https://pkware.cachefly.net/webdocs/casestudies/APPNOTE.TXT
//
// SECURITY NOTES
// - We resolve every entry path inside `targetDir` and require startsWith(targetDir + sep)
// - We reject absolute paths, drive letters, NUL bytes, and `..` segments after normalization
// - We reject entries whose external_attr indicates a symbolic link (0xA000 in upper word, Unix)
// - We reject ZIP64 entries (we don't claim to support them; >4GB is out of scope for VSIX)
// - We reject entries with general purpose bit 0 (encryption) set
// - Directories are not extracted from entries — created on-demand from file paths

import { createInflateRaw } from 'node:zlib';
import { mkdir, writeFile } from 'node:fs/promises';
import { dirname, join, resolve, sep, normalize } from 'node:path';

const SIG_LFH = 0x04034b50; // Local File Header
const SIG_CD = 0x02014b50; // Central Directory
const SIG_EOCD = 0x06054b50; // End of Central Directory
const SIG_EOCD64_LOC = 0x07064b50; // ZIP64 EOCD locator (presence => reject)
const SIG_EOCD64 = 0x06064b50; // ZIP64 EOCD record

const DEFAULT_CAPS = Object.freeze({
  maxEntries: 10_000,
  maxUncompressedBytes: 500 * 1024 * 1024, // 500MB
  maxExpansionRatio: 100, // sum uncompressed / sum compressed
  maxDepth: 20,
});

class ZipError extends Error {
  constructor(message, code = 'ZIP_INVALID') {
    super(message);
    this.code = code;
    this.name = 'ZipError';
  }
}

function readU16(buf, off) { return buf.readUInt16LE(off); }
function readU32(buf, off) { return buf.readUInt32LE(off); }

/**
 * Locate the End of Central Directory (EOCD) by scanning backwards from end of buffer.
 * EOCD is 22 bytes minimum; comment can extend it up to 65557 bytes total.
 */
function findEOCD(buf) {
  const minOff = Math.max(0, buf.length - 22 - 0xFFFF);
  for (let off = buf.length - 22; off >= minOff; off--) {
    if (readU32(buf, off) === SIG_EOCD) {
      // Sanity: comment length must fit
      const commentLen = readU16(buf, off + 20);
      if (off + 22 + commentLen === buf.length) return off;
    }
  }
  throw new ZipError('EOCD signature not found', 'ZIP_NO_EOCD');
}

function parseEOCD(buf, off) {
  return {
    diskNumber: readU16(buf, off + 4),
    cdDisk: readU16(buf, off + 6),
    cdEntriesOnDisk: readU16(buf, off + 8),
    cdEntriesTotal: readU16(buf, off + 10),
    cdSize: readU32(buf, off + 12),
    cdOffset: readU32(buf, off + 16),
    commentLength: readU16(buf, off + 20),
  };
}

/**
 * Parse central directory entries and return a structured list.
 * @param {Buffer} buf
 * @returns {Array}
 */
function parseCentralDirectory(buf) {
  // Reject ZIP64 (we do not implement it).
  // Look for ZIP64 EOCD locator in the 20 bytes immediately before EOCD.
  const eocdOff = findEOCD(buf);
  if (eocdOff >= 20 && readU32(buf, eocdOff - 20) === SIG_EOCD64_LOC) {
    throw new ZipError('ZIP64 archives are not supported', 'ZIP_ZIP64');
  }

  const eocd = parseEOCD(buf, eocdOff);
  if (eocd.diskNumber !== 0 || eocd.cdDisk !== 0) {
    throw new ZipError('multi-disk archives are not supported', 'ZIP_MULTIDISK');
  }
  if (eocd.cdEntriesOnDisk !== eocd.cdEntriesTotal) {
    throw new ZipError('split central directory not supported', 'ZIP_SPLIT_CD');
  }
  if (eocd.cdOffset === 0xFFFFFFFF || eocd.cdSize === 0xFFFFFFFF) {
    throw new ZipError('ZIP64 fields detected', 'ZIP_ZIP64');
  }
  if (eocd.cdOffset + eocd.cdSize > buf.length) {
    throw new ZipError('central directory extends past EOF', 'ZIP_BAD_CD_OFFSET');
  }

  const entries = [];
  let p = eocd.cdOffset;
  for (let i = 0; i < eocd.cdEntriesTotal; i++) {
    if (p + 46 > buf.length) throw new ZipError('truncated central directory', 'ZIP_TRUNCATED_CD');
    if (readU32(buf, p) !== SIG_CD) throw new ZipError('bad central directory signature', 'ZIP_BAD_CD_SIG');

    const versionMadeBy = readU16(buf, p + 4);
    const generalFlags = readU16(buf, p + 8);
    const method = readU16(buf, p + 10);
    const crc32 = readU32(buf, p + 16);
    const compSize = readU32(buf, p + 20);
    const uncompSize = readU32(buf, p + 24);
    const nameLen = readU16(buf, p + 28);
    const extraLen = readU16(buf, p + 30);
    const commentLen = readU16(buf, p + 32);
    const externalAttr = readU32(buf, p + 38);
    const lfhOffset = readU32(buf, p + 42);

    if (compSize === 0xFFFFFFFF || uncompSize === 0xFFFFFFFF || lfhOffset === 0xFFFFFFFF) {
      throw new ZipError('ZIP64 fields detected in entry', 'ZIP_ZIP64');
    }

    const nameStart = p + 46;
    if (nameStart + nameLen > buf.length) throw new ZipError('entry name extends past EOF', 'ZIP_BAD_NAME');
    const rawName = buf.slice(nameStart, nameStart + nameLen).toString('utf8');

    entries.push({
      versionMadeBy,
      generalFlags,
      method,
      crc32,
      compSize,
      uncompSize,
      nameLen,
      extraLen,
      commentLen,
      externalAttr,
      lfhOffset,
      name: rawName,
    });

    p += 46 + nameLen + extraLen + commentLen;
  }
  return entries;
}

/**
 * Validate an entry name and return a relative path safe to join with targetDir.
 * Throws ZipError on any rejected pattern. Returns null for entries that should be skipped (directory entries).
 */
function validateEntryName(rawName, caps) {
  if (rawName.length === 0) throw new ZipError('empty entry name', 'ZIP_BAD_NAME');
  if (rawName.length > 1024) throw new ZipError('entry name exceeds 1024 chars', 'ZIP_BAD_NAME');
  if (rawName.includes('\u0000')) throw new ZipError('NUL byte in entry name', 'ZIP_BAD_NAME');

  // Directory entries end with '/' — skip; we'll create dirs as needed for files.
  if (rawName.endsWith('/')) return null;

  // Reject Windows absolute paths (drive letters, UNC) and POSIX absolute paths.
  if (rawName.startsWith('/') || rawName.startsWith('\\')) {
    throw new ZipError(`absolute entry path: ${rawName}`, 'ZIP_ABSOLUTE');
  }
  if (/^[A-Za-z]:[\\/]/.test(rawName)) {
    throw new ZipError(`drive-letter entry path: ${rawName}`, 'ZIP_ABSOLUTE');
  }

  // Normalize: convert backslashes to forward slashes (some zip tools emit \).
  const unixName = rawName.replace(/\\/g, '/');

  // Reject any path component equal to '..' up front, so no entry can escape the target.
  const parts = unixName.split('/');
  for (const part of parts) {
    if (part === '..') {
      throw new ZipError(`parent traversal in entry: ${rawName}`, 'ZIP_TRAVERSAL');
    }
  }

  // Final path normalization (collapses '.' segments, NFC).
  const normalized = normalize(unixName).normalize('NFC');
  if (normalized.startsWith('..') || normalized.includes(`${sep}..${sep}`) || normalized === '..') {
    throw new ZipError(`parent traversal after normalization: ${rawName}`, 'ZIP_TRAVERSAL');
  }
  if (normalized.split(sep).length > caps.maxDepth) {
    throw new ZipError(`entry path exceeds depth ${caps.maxDepth}: ${rawName}`, 'ZIP_DEEP');
  }

  return normalized;
}

/**
 * Detect symlink entries from the external_attr field.
 * For Unix-made entries (versionMadeBy upper byte = 3), the high 16 bits are mode bits.
 * Symlink mode is 0xA000.
 */
function isSymlink(entry) {
  const madeByOs = (entry.versionMadeBy >>> 8) & 0xFF;
  if (madeByOs !== 3) return false; // Only Unix encodes mode bits
  const mode = (entry.externalAttr >>> 16) & 0xFFFF;
  return (mode & 0xF000) === 0xA000;
}

/**
 * Inflate raw deflate bytes via node:zlib, with a hard upper bound on output size.
 * Aborts as soon as output exceeds maxBytes (zip-bomb defense).
 */
function inflateBounded(input, expectedSize, maxBytes) {
  return new Promise((resolvePromise, reject) => {
    const stream = createInflateRaw();
    const chunks = [];
    let total = 0;
    let aborted = false;

    stream.on('data', chunk => {
      if (aborted) return;
      total += chunk.length;
      if (total > maxBytes) {
        aborted = true;
        stream.destroy(new ZipError(`inflate exceeds cap (${maxBytes} bytes)`, 'ZIP_BOMB'));
        return;
      }
      chunks.push(chunk);
    });
    stream.on('end', () => {
      if (aborted) return;
      if (total !== expectedSize) {
        reject(new ZipError(
          `inflated size ${total} does not match expected ${expectedSize}`,
          'ZIP_SIZE_MISMATCH',
        ));
        return;
      }
      resolvePromise(Buffer.concat(chunks, total));
    });
    stream.on('error', err => {
      if (err instanceof ZipError) reject(err);
      else reject(new ZipError(`inflate failed: ${err.message}`, 'ZIP_INFLATE'));
    });

    stream.end(input);
  });
}

/**
 * Read the data section of one entry given its central directory record.
 * @param {Buffer} buf
 * @param {object} entry
 * @param {number} remainingBudget - max bytes we may still inflate this archive
 * @returns {Promise<Buffer>}
 */
async function readEntryData(buf, entry, remainingBudget) {
  const lfhOff = entry.lfhOffset;
  if (lfhOff + 30 > buf.length) throw new ZipError('LFH past EOF', 'ZIP_BAD_LFH');
  if (readU32(buf, lfhOff) !== SIG_LFH) throw new ZipError('bad LFH signature', 'ZIP_BAD_LFH');

  const lfhNameLen = readU16(buf, lfhOff + 26);
  const lfhExtraLen = readU16(buf, lfhOff + 28);
  const dataStart = lfhOff + 30 + lfhNameLen + lfhExtraLen;
  const dataEnd = dataStart + entry.compSize;
  if (dataEnd > buf.length) throw new ZipError('entry data past EOF', 'ZIP_BAD_DATA');

  const compressed = buf.slice(dataStart, dataEnd);
  const cap = Math.min(entry.uncompSize, remainingBudget);

  if (entry.method === 0) {
    if (entry.compSize !== entry.uncompSize) {
      throw new ZipError('STORED entry compSize != uncompSize', 'ZIP_BAD_STORED');
    }
    if (entry.uncompSize > remainingBudget) {
      throw new ZipError('STORED entry exceeds budget', 'ZIP_BOMB');
    }
    return compressed;
  }
  if (entry.method === 8) {
    return inflateBounded(compressed, entry.uncompSize, cap);
  }
  throw new ZipError(`unsupported compression method ${entry.method}`, 'ZIP_BAD_METHOD');
}

/**
 * Extract a VSIX/ZIP buffer to targetDir with all caps and validations applied.
 * targetDir must already exist or be creatable; caller owns cleanup.
 *
 * @param {Buffer} buf - The full ZIP buffer
 * @param {string} targetDir - Absolute path to extract into
 * @param {object} [opts]
 * @param {object} [opts.caps] - Override default caps
 * @returns {Promise<{ entries: number, bytes: number, files: string[] }>}
 */
export async function extractToDir(buf, targetDir, opts = {}) {
  const caps = { ...DEFAULT_CAPS, ...(opts.caps || {}) };
  const absTarget = resolve(targetDir);
  await mkdir(absTarget, { recursive: true });

  const entries = parseCentralDirectory(buf);
  if (entries.length > caps.maxEntries) {
    throw new ZipError(`too many entries (${entries.length} > ${caps.maxEntries})`, 'ZIP_TOO_MANY_ENTRIES');
  }

  let totalUncomp = 0;
  let totalComp = 0;
  const files = [];

  for (const entry of entries) {
    if (entry.generalFlags & 0x0001) {
      throw new ZipError(`encrypted entry not allowed: ${entry.name}`, 'ZIP_ENCRYPTED');
    }
    if (isSymlink(entry)) {
      throw new ZipError(`symlink entry not allowed: ${entry.name}`, 'ZIP_SYMLINK');
    }

    const safeName = validateEntryName(entry.name, caps);
    if (safeName === null) continue; // directory entry

    const fullPath = join(absTarget, safeName);
    const resolved = resolve(fullPath);
    if (resolved !== absTarget && !resolved.startsWith(absTarget + sep)) {
      throw new ZipError(`zip-slip: ${entry.name} resolves outside target`, 'ZIP_TRAVERSAL');
    }

    if (entry.uncompSize > caps.maxUncompressedBytes) {
      throw new ZipError(`entry ${entry.name} exceeds maxUncompressedBytes`, 'ZIP_BOMB');
    }
    const remaining = caps.maxUncompressedBytes - totalUncomp;
    const data = await readEntryData(buf, entry, remaining);

    totalUncomp += data.length;
    totalComp += Math.max(1, entry.compSize); // avoid div-by-zero in ratio
    if (totalUncomp > caps.maxUncompressedBytes) {
      throw new ZipError(`total uncompressed exceeds cap`, 'ZIP_BOMB');
    }
    if (totalUncomp / totalComp > caps.maxExpansionRatio) {
      throw new ZipError(
        `expansion ratio exceeds ${caps.maxExpansionRatio}x (${totalUncomp}/${totalComp})`,
        'ZIP_BOMB',
      );
    }

    await mkdir(dirname(resolved), { recursive: true });
    await writeFile(resolved, data);
    files.push(safeName);
  }

  return { entries: files.length, bytes: totalUncomp, files };
}

/**
 * Parse central directory only — no extraction. Useful for inspection / tests.
 */
export function listEntries(buf) {
  return parseCentralDirectory(buf).map(e => ({
    name: e.name,
    method: e.method,
    compSize: e.compSize,
    uncompSize: e.uncompSize,
    isDir: e.name.endsWith('/'),
    isSymlink: isSymlink(e),
  }));
}

export { ZipError };
export const __testing = {
  DEFAULT_CAPS,
  validateEntryName,
  isSymlink,
  parseCentralDirectory,
  inflateBounded,
};
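The zip-slip containment check used by the extractor (resolve the joined path, then require it to stay under the target) can be illustrated in isolation. A minimal standalone sketch of that one defense, with a hypothetical helper name:

```javascript
import { join, resolve, sep } from 'node:path';

// Illustrative containment check mirroring the extractor's zip-slip defense:
// join the entry name onto the target, resolve it, and require the result to
// sit at or under the target directory. The real extractor additionally
// rejects '..' components, absolute paths, and drive letters before joining.
function isContained(targetDir, entryName) {
  const absTarget = resolve(targetDir);
  const resolved = resolve(join(absTarget, entryName));
  return resolved === absTarget || resolved.startsWith(absTarget + sep);
}

console.log(isContained('/tmp/extract', 'extension/package.json')); // true
console.log(isContained('/tmp/extract', '../../etc/passwd'));       // false
```

Doing the prefix test on resolved absolute paths (rather than on the raw entry name) is what closes the gap left by names that only escape after normalization.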
97
plugins/llm-security/tests/lib/build-zip.mjs
Normal file

@@ -0,0 +1,97 @@
// build-zip.mjs — Minimal synthetic ZIP builder for tests.
|
||||
// Supports STORE method only. Lets tests construct adversarial archives that
|
||||
// real zip tools refuse to emit (zip-slip names, symlink mode bits, oversized
|
||||
// uncompressed sizes for bomb tests).
|
||||
|
||||
import { crc32 } from 'node:zlib';
|
||||
|
||||
const SIG_LFH = 0x04034b50;
|
||||
const SIG_CD = 0x02014b50;
|
||||
const SIG_EOCD = 0x06054b50;
|
||||
|
||||
function crc(buf) {
|
||||
return crc32(buf) >>> 0;
|
||||
}
|
||||
|
||||
/**
|
||||
* Build a ZIP buffer from a list of entries.
|
||||
* @param {Array<{ name: string, data: Buffer|string, externalAttr?: number, versionMadeBy?: number, declaredUncompSize?: number, declaredCompSize?: number }>} entries
|
||||
* @returns {Buffer}
|
||||
*/
|
||||
export function buildZip(entries) {
|
||||
const lfhParts = [];
|
||||
const cdParts = [];
|
||||
let offset = 0;
|
||||
|
||||
for (const entry of entries) {
|
||||
const nameBuf = Buffer.from(entry.name, 'utf8');
|
||||
const data = Buffer.isBuffer(entry.data) ? entry.data : Buffer.from(entry.data || '', 'utf8');
|
||||
const compSize = entry.declaredCompSize ?? data.length;
|
||||
const uncompSize = entry.declaredUncompSize ?? data.length;
|
||||
const c = crc(data);
|
||||
|
||||
// Local file header (30 bytes)
|
||||
const lfh = Buffer.alloc(30);
|
||||
lfh.writeUInt32LE(SIG_LFH, 0);
|
||||
lfh.writeUInt16LE(20, 4); // version needed
|
||||
lfh.writeUInt16LE(0, 6); // flags
|
||||
lfh.writeUInt16LE(0, 8); // method = STORE
|
||||
lfh.writeUInt16LE(0, 10); // time
|
||||
lfh.writeUInt16LE(0, 12); // date
|
||||
lfh.writeUInt32LE(c, 14); // crc32
|
||||
lfh.writeUInt32LE(compSize, 18); // compressed size
|
||||
lfh.writeUInt32LE(uncompSize, 22); // uncompressed size
|
||||
lfh.writeUInt16LE(nameBuf.length, 26);
|
||||
lfh.writeUInt16LE(0, 28); // extra len
|
||||
|
||||
lfhParts.push(lfh, nameBuf, data);
|
||||
const thisLfhOffset = offset;
|
||||
offset += lfh.length + nameBuf.length + data.length;
|
||||
|
||||
// Central directory header (46 bytes)
|
||||
const cd = Buffer.alloc(46);
|
||||
cd.writeUInt32LE(SIG_CD, 0);
|
||||
cd.writeUInt16LE(entry.versionMadeBy ?? (3 << 8) | 20, 4); // OS=Unix(3), version=20
|
||||
cd.writeUInt16LE(20, 6);
|
||||
cd.writeUInt16LE(0, 8);
|
||||
cd.writeUInt16LE(0, 10);
|
||||
cd.writeUInt16LE(0, 12);
|
||||
cd.writeUInt16LE(0, 14);
|
||||
cd.writeUInt32LE(c, 16);
|
||||
cd.writeUInt32LE(compSize, 20);
|
||||
cd.writeUInt32LE(uncompSize, 24);
|
||||
cd.writeUInt16LE(nameBuf.length, 28);
|
||||
cd.writeUInt16LE(0, 30);
|
||||
cd.writeUInt16LE(0, 32); // comment len
|
||||
cd.writeUInt16LE(0, 34); // disk start
|
||||
cd.writeUInt16LE(0, 36); // internal attrs
|
||||
cd.writeUInt32LE((entry.externalAttr ?? 0) >>> 0, 38); // external attrs (unsigned)
|
||||
cd.writeUInt32LE(thisLfhOffset, 42);
|
||||
|
||||
cdParts.push(cd, nameBuf);
|
||||
}
|
||||
|
||||
const lfhSection = Buffer.concat(lfhParts);
|
||||
const cdSection = Buffer.concat(cdParts);
|
||||
const cdOffset = lfhSection.length;
|
||||
const cdSize = cdSection.length;
|
||||
|
||||
const eocd = Buffer.alloc(22);
|
||||
eocd.writeUInt32LE(SIG_EOCD, 0);
|
||||
eocd.writeUInt16LE(0, 4);
|
||||
eocd.writeUInt16LE(0, 6);
|
||||
eocd.writeUInt16LE(entries.length, 8);
|
||||
eocd.writeUInt16LE(entries.length, 10);
|
||||
eocd.writeUInt32LE(cdSize, 12);
|
||||
eocd.writeUInt32LE(cdOffset, 16);
|
||||
eocd.writeUInt16LE(0, 20);
|
||||
|
||||
return Buffer.concat([lfhSection, cdSection, eocd]);
|
||||
}
|
||||
|
||||
/** Convenience: produce a unix mode in the upper 16 bits of externalAttr. */
|
||||
export function unixModeAttr(mode) {
|
||||
return (mode & 0xFFFF) << 16;
|
||||
}
|
||||
|
||||
export const MODE_SYMLINK = 0xA1FF; // S_IFLNK | rwxrwxrwx
|
||||
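The symlink handling above hinges on how ZIP packs a Unix file mode into the central directory's 32-bit external attribute field. A standalone sketch of that bit layout (the helper names mirror the ones in `build-zip.mjs`; the arithmetic here is the general ZIP convention, not plugin-specific code):

```javascript
// The Unix mode occupies the top 16 bits of the external attribute field.
// 0xA1FF = S_IFLNK (0xA000) | 0777 permission bits.
const unixModeAttr = (mode) => (mode & 0xFFFF) << 16;
const MODE_SYMLINK = 0xA1FF;

// >>> 0 forces an unsigned view: << can yield a negative signed 32-bit value.
const attr = unixModeAttr(MODE_SYMLINK) >>> 0;
console.log(attr.toString(16)); // "a1ff0000"

// An extractor recovers the file-type bits by shifting back down and masking:
const typeBits = (attr >>> 16) & 0xF000;
console.log(typeBits === 0xA000); // true — symlink
```

This is why `buildZip` applies `>>> 0` when writing the attribute: `writeUInt32LE` rejects negative values.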
plugins/llm-security/tests/scanners/ide-extension-url.test.mjs (new file, 145 lines)
@@ -0,0 +1,145 @@
// ide-extension-url.test.mjs — Integration tests for `/security ide-scan <url>`.
// Mocks global.fetch so we never hit real Marketplace / OpenVSX endpoints.

import { describe, it, before, after } from 'node:test';
import assert from 'node:assert/strict';
import { resetCounter } from '../../scanners/lib/output.mjs';
import { scan } from '../../scanners/ide-extension-scanner.mjs';
import { buildZip } from '../lib/build-zip.mjs';

const realFetch = globalThis.fetch;

function mockResponse(buffer, { status = 200 } = {}) {
  const stream = new ReadableStream({
    start(controller) { controller.enqueue(buffer); controller.close(); },
  });
  return new Response(stream, { status, headers: { 'content-type': 'application/octet-stream' } });
}

function jsonResponse(obj) {
  return new Response(JSON.stringify(obj), {
    status: 200,
    headers: { 'content-type': 'application/json' },
  });
}

function buildBenignVsix() {
  const pkg = JSON.stringify({
    publisher: 'anthropic',
    name: 'claude-code',
    version: '1.0.0',
    engines: { vscode: '^1.80.0' },
    main: './extension.js',
    activationEvents: ['onCommand:claude.hello'],
    categories: ['Other'],
  });
  return buildZip([
    { name: 'extension.vsixmanifest', data: '<PackageManifest></PackageManifest>' },
    { name: 'extension/package.json', data: pkg },
    { name: 'extension/extension.js', data: 'module.exports = { activate(){} };' },
  ]);
}

function installFetchRouter(routes) {
  globalThis.fetch = async (url) => {
    const handler = routes(url);
    if (!handler) throw new Error(`unrouted fetch: ${url}`);
    return handler;
  };
}

describe('ide-extension-scanner — URL mode', () => {
  before(() => resetCounter());
  after(() => { globalThis.fetch = realFetch; });

  it('rejects unsupported URL with a warning, no extensions scanned', async () => {
    installFetchRouter(() => null);
    const env = await scan('https://example.com/random.zip', { vscodeOnly: true });
    assert.equal(env.extensions.length, 0);
    assert.ok(env.meta.warnings.some(w => /unsupported URL/i.test(w)));
    assert.equal(env.meta.source, null);
  });

  it('reports github URL as unsupported in v6.4.0', async () => {
    installFetchRouter(() => null);
    const env = await scan('https://github.com/anthropic/claude-code', { vscodeOnly: true });
    assert.equal(env.extensions.length, 0);
    assert.ok(env.meta.warnings.some(w => /GitHub repo URLs/i.test(w)));
  });

  it('fetches OpenVSX VSIX and scans the extracted extension', async () => {
    const vsix = buildBenignVsix();
    let metaCalled = false;
    let downloadCalled = false;
    installFetchRouter((url) => {
      if (url.endsWith('/latest')) {
        metaCalled = true;
        return jsonResponse({ version: '1.0.0' });
      }
      if (url.includes('/file/') && url.endsWith('.vsix')) {
        downloadCalled = true;
        return mockResponse(vsix);
      }
      return null;
    });

    const env = await scan('https://open-vsx.org/extension/anthropic/claude-code', { vscodeOnly: true });
    assert.ok(metaCalled, 'expected metadata fetch for latest version');
    assert.ok(downloadCalled, 'expected VSIX download');
    assert.equal(env.extensions.length, 1);
    assert.equal(env.extensions[0].id, 'anthropic.claude-code');
    assert.equal(env.extensions[0].version, '1.0.0');
    assert.ok(env.meta.source);
    assert.equal(env.meta.source.type, 'url');
    assert.equal(env.meta.source.publisher, 'anthropic');
    assert.equal(env.meta.source.name, 'claude-code');
    assert.equal(env.meta.source.version, '1.0.0');
    assert.match(env.meta.source.sha256, /^[a-f0-9]{64}$/);
    assert.equal(env.meta.target, 'https://open-vsx.org/extension/anthropic/claude-code');
  });

  it('fetches Marketplace VSIX directly without metadata round-trip', async () => {
    const vsix = buildBenignVsix();
    let downloads = 0;
    installFetchRouter((url) => {
      if (url.includes('Microsoft.VisualStudio.Services.VSIXPackage')) {
        downloads++;
        return mockResponse(vsix);
      }
      return null;
    });

    const env = await scan('https://marketplace.visualstudio.com/items?itemName=anthropic.claude-code', { vscodeOnly: true });
    assert.equal(downloads, 1);
    assert.equal(env.extensions.length, 1);
    assert.equal(env.extensions[0].id, 'anthropic.claude-code');
    assert.equal(env.meta.source.type, 'url');
    assert.equal(env.meta.source.requestedUrl?.includes('VSIXPackage'), true);
  });

  it('cleans up temp dir even when extraction fails', async () => {
    // Return a non-zip body so extract throws.
    installFetchRouter(() => mockResponse(Buffer.from('not a zip at all')));
    const env = await scan('https://example.com/bad.vsix', { vscodeOnly: true });
    assert.equal(env.extensions.length, 0);
    assert.ok(env.meta.warnings.some(w => /malformed VSIX/.test(w)));
  });

  it('rejects zip-slip VSIX as malformed', async () => {
    const evil = buildZip([
      { name: 'extension/package.json', data: '{}' },
      { name: '../escape.txt', data: 'pwned' },
    ]);
    installFetchRouter(() => mockResponse(evil));
    const env = await scan('https://example.com/evil.vsix', { vscodeOnly: true });
    assert.equal(env.extensions.length, 0);
    assert.ok(env.meta.warnings.some(w => /malformed VSIX/.test(w) && /traversal/.test(w)));
  });

  it('handles fetch network failure cleanly', async () => {
    installFetchRouter(() => { throw new Error('ECONNREFUSED'); });
    const env = await scan('https://open-vsx.org/extension/foo/bar', { vscodeOnly: true });
    assert.equal(env.extensions.length, 0);
    assert.ok(env.meta.warnings.some(w => /URL fetch\/extract failed/.test(w)));
  });
});
plugins/llm-security/tests/scanners/vsix-fetch.test.mjs (new file, 126 lines)
@@ -0,0 +1,126 @@
// vsix-fetch.test.mjs — Unit tests for URL detection + body capping.

import { describe, it } from 'node:test';
import assert from 'node:assert/strict';
import { detectUrlType, __testing } from '../../scanners/lib/vsix-fetch.mjs';

const { isAllowedHost, readBodyCapped, MAX_VSIX_BYTES } = __testing;

describe('detectUrlType', () => {
  it('detects VS Code Marketplace URL', () => {
    const out = detectUrlType('https://marketplace.visualstudio.com/items?itemName=ms-python.python');
    assert.equal(out.type, 'marketplace');
    assert.equal(out.publisher, 'ms-python');
    assert.equal(out.name, 'python');
  });

  it('returns unknown for marketplace URL without itemName', () => {
    const out = detectUrlType('https://marketplace.visualstudio.com/items');
    assert.equal(out.type, 'unknown');
  });

  it('returns unknown for marketplace itemName without a dot', () => {
    const out = detectUrlType('https://marketplace.visualstudio.com/items?itemName=foobar');
    assert.equal(out.type, 'unknown');
  });

  it('detects OpenVSX URL with version', () => {
    const out = detectUrlType('https://open-vsx.org/extension/anthropic/claude-code/1.2.3');
    assert.equal(out.type, 'openvsx');
    assert.equal(out.publisher, 'anthropic');
    assert.equal(out.name, 'claude-code');
    assert.equal(out.version, '1.2.3');
  });

  it('detects OpenVSX URL without version', () => {
    const out = detectUrlType('https://open-vsx.org/extension/anthropic/claude-code');
    assert.equal(out.type, 'openvsx');
    assert.equal(out.publisher, 'anthropic');
    assert.equal(out.name, 'claude-code');
    assert.equal(out.version, null);
  });

  it('detects direct .vsix download', () => {
    const out = detectUrlType('https://example.com/path/extension.vsix');
    assert.equal(out.type, 'vsix');
  });

  it('detects GitHub URL as github (unsupported)', () => {
    const out = detectUrlType('https://github.com/anthropic/claude-code');
    assert.equal(out.type, 'github');
  });

  it('rejects plain HTTP', () => {
    const out = detectUrlType('http://marketplace.visualstudio.com/items?itemName=ms-python.python');
    assert.equal(out.type, 'unknown');
  });

  it('returns unknown for malformed URL', () => {
    const out = detectUrlType('not a url');
    assert.equal(out.type, 'unknown');
  });

  it('returns unknown for unrelated HTTPS URL', () => {
    const out = detectUrlType('https://example.com/somefile.zip');
    assert.equal(out.type, 'unknown');
  });
});

describe('isAllowedHost', () => {
  it('allows marketplace gallery CDN for marketplace fetches', () => {
    assert.equal(isAllowedHost('foo.gallerycdn.vsassets.io', 'marketplace'), true);
    assert.equal(isAllowedHost('marketplace.visualstudio.com', 'marketplace'), true);
  });

  it('rejects unrelated host for marketplace fetches', () => {
    assert.equal(isAllowedHost('evil.example.com', 'marketplace'), false);
  });

  it('allows openvsx blob storage', () => {
    assert.equal(isAllowedHost('open-vsx.org', 'openvsx'), true);
    assert.equal(isAllowedHost('openvsxorg.blob.core.windows.net', 'openvsx'), true);
  });

  it('rejects unrelated host for openvsx fetches', () => {
    assert.equal(isAllowedHost('evil.example.com', 'openvsx'), false);
  });
});

describe('readBodyCapped', () => {
  function makeStreamResponse(chunks) {
    const stream = new ReadableStream({
      start(controller) {
        for (const chunk of chunks) controller.enqueue(chunk);
        controller.close();
      },
    });
    return new Response(stream);
  }

  it('reads small body fully and computes SHA-256', async () => {
    const data = new TextEncoder().encode('hello world');
    const res = makeStreamResponse([data]);
    const ctrl = new AbortController();
    const out = await readBodyCapped(res, ctrl);
    assert.equal(out.size, 11);
    assert.equal(out.buffer.toString('utf8'), 'hello world');
    // sha256("hello world")
    assert.equal(out.sha256, 'b94d27b9934d3e08a52e52d7da7dabfac484efe37a5380ee9088f7ace2efcde9');
  });

  it('aborts when body exceeds MAX_VSIX_BYTES', async () => {
    // Stream a small chunk repeatedly until the total exceeds the cap.
    const chunkSize = 1024 * 1024;
    const chunk = new Uint8Array(chunkSize);
    const totalChunks = Math.ceil(MAX_VSIX_BYTES / chunkSize) + 2; // overshoot
    const stream = new ReadableStream({
      async start(controller) {
        for (let i = 0; i < totalChunks; i++) controller.enqueue(chunk);
        controller.close();
      },
    });
    const res = new Response(stream);
    const ctrl = new AbortController();
    await assert.rejects(() => readBodyCapped(res, ctrl), /exceeds maximum size/);
  });
});
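The cap-plus-hash behaviour these `readBodyCapped` tests exercise can be sketched independently with `node:crypto`. The function name and return shape below are illustrative only, chosen to match what the assertions above expect, not the plugin's actual implementation:

```javascript
import { createHash } from 'node:crypto';

// Hypothetical sketch: accumulate chunks while enforcing a byte cap and
// computing SHA-256 incrementally, so an oversized body aborts early
// instead of being buffered in full.
function hashCapped(chunks, maxBytes) {
  const hash = createHash('sha256');
  const parts = [];
  let size = 0;
  for (const chunk of chunks) {
    size += chunk.length;
    if (size > maxBytes) throw new Error(`body exceeds maximum size (${maxBytes} bytes)`);
    hash.update(chunk);
    parts.push(chunk);
  }
  return { buffer: Buffer.concat(parts), size, sha256: hash.digest('hex') };
}

const out = hashCapped([Buffer.from('hello world')], 1024);
console.log(out.size);   // 11
console.log(out.sha256); // b94d27b9934d3e08a52e52d7da7dabfac484efe37a5380ee9088f7ace2efcde9
```

Streaming the digest is what lets the fetcher surface the SHA-256 in `meta.source` without a second pass over the body.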
plugins/llm-security/tests/scanners/zip-extract.test.mjs (new file, 267 lines)
@@ -0,0 +1,267 @@
// zip-extract.test.mjs — Unit tests for the zero-dep ZIP extractor.

import { describe, it } from 'node:test';
import assert from 'node:assert/strict';
import { mkdtemp, rm, readFile, readdir } from 'node:fs/promises';
import { tmpdir } from 'node:os';
import { join } from 'node:path';
import { deflateRawSync } from 'node:zlib';
import { extractToDir, listEntries, ZipError, __testing } from '../../scanners/lib/zip-extract.mjs';
import { buildZip, unixModeAttr, MODE_SYMLINK } from '../lib/build-zip.mjs';

const { validateEntryName, isSymlink, DEFAULT_CAPS } = __testing;

async function withTempDir(fn) {
  const dir = await mkdtemp(join(tmpdir(), 'zip-test-'));
  try { return await fn(dir); }
  finally { await rm(dir, { recursive: true, force: true }); }
}

describe('validateEntryName', () => {
  it('accepts a normal nested path', () => {
    const out = validateEntryName('extension/package.json', DEFAULT_CAPS);
    assert.ok(out && out.includes('package.json'));
  });
  it('returns null for directory entries', () => {
    assert.equal(validateEntryName('extension/', DEFAULT_CAPS), null);
  });
  it('rejects parent traversal', () => {
    assert.throws(() => validateEntryName('../etc/passwd', DEFAULT_CAPS), /traversal/);
  });
  it('rejects deep parent traversal', () => {
    assert.throws(() => validateEntryName('extension/../../escape', DEFAULT_CAPS), /traversal/);
  });
  it('rejects POSIX absolute paths', () => {
    assert.throws(() => validateEntryName('/etc/passwd', DEFAULT_CAPS), /absolute/);
  });
  it('rejects Windows drive letters', () => {
    assert.throws(() => validateEntryName('C:\\Windows\\sys', DEFAULT_CAPS), /drive-letter|absolute/);
  });
  it('rejects backslash absolute paths', () => {
    assert.throws(() => validateEntryName('\\foo', DEFAULT_CAPS), /absolute/);
  });
  it('rejects NUL bytes', () => {
    assert.throws(() => validateEntryName('foo\u0000bar', DEFAULT_CAPS), /NUL/);
  });
  it('rejects empty entry names', () => {
    assert.throws(() => validateEntryName('', DEFAULT_CAPS), /empty/);
  });
  it('rejects very deep paths beyond depth cap', () => {
    const deep = Array.from({ length: 25 }, () => 'a').join('/');
    assert.throws(() => validateEntryName(deep, { ...DEFAULT_CAPS, maxDepth: 20 }), /depth/);
  });
  it('normalizes backslashes in path', () => {
    const out = validateEntryName('extension\\sub\\file.txt', DEFAULT_CAPS);
    assert.ok(out && (out.includes('sub') || out.includes('file.txt')));
  });
});

describe('isSymlink', () => {
  it('detects unix-made symlink mode bits', () => {
    const entry = { versionMadeBy: (3 << 8) | 20, externalAttr: unixModeAttr(MODE_SYMLINK) };
    assert.equal(isSymlink(entry), true);
  });
  it('ignores mode bits when versionMadeBy OS != Unix', () => {
    const entry = { versionMadeBy: (0 << 8) | 20, externalAttr: unixModeAttr(MODE_SYMLINK) };
    assert.equal(isSymlink(entry), false);
  });
  it('returns false for regular file', () => {
    const entry = { versionMadeBy: (3 << 8) | 20, externalAttr: unixModeAttr(0x81A4) };
    assert.equal(isSymlink(entry), false);
  });
});

describe('extractToDir — happy path', () => {
  it('extracts a small ZIP with a nested file', async () => {
    const buf = buildZip([
      { name: 'extension/package.json', data: '{"hello":"world"}' },
      { name: 'extension/extension.js', data: 'console.log(1)' },
    ]);
    await withTempDir(async (dir) => {
      const r = await extractToDir(buf, dir);
      assert.equal(r.entries, 2);
      const pkg = await readFile(join(dir, 'extension/package.json'), 'utf8');
      assert.match(pkg, /hello/);
    });
  });

  it('extracts deflate-compressed entries', async () => {
    // Pseudo-random bytes so the compression ratio stays well under the cap.
    const original = Buffer.alloc(2000);
    for (let i = 0; i < original.length; i++) original[i] = (i * 73 + 11) & 0xFF;
    const compressed = deflateRawSync(original);
    // buildZip only supports STORE, so hand-craft a single DEFLATE entry
    // (method=8) with the real compressed/uncompressed sizes.
    const nameBuf = Buffer.from('extension/big.txt', 'utf8');
    const lfh = Buffer.alloc(30);
    lfh.writeUInt32LE(0x04034b50, 0);
    lfh.writeUInt16LE(20, 4);
    lfh.writeUInt16LE(0, 6);
    lfh.writeUInt16LE(8, 8); // DEFLATE
    lfh.writeUInt32LE(0, 14); // CRC unused (we don't validate)
    lfh.writeUInt32LE(compressed.length, 18);
    lfh.writeUInt32LE(original.length, 22);
    lfh.writeUInt16LE(nameBuf.length, 26);
    lfh.writeUInt16LE(0, 28);
    const cd = Buffer.alloc(46);
    cd.writeUInt32LE(0x02014b50, 0);
    cd.writeUInt16LE(20, 4);
    cd.writeUInt16LE(20, 6);
    cd.writeUInt16LE(0, 8);
    cd.writeUInt16LE(8, 10); // DEFLATE
    cd.writeUInt32LE(0, 16);
    cd.writeUInt32LE(compressed.length, 20);
    cd.writeUInt32LE(original.length, 24);
    cd.writeUInt16LE(nameBuf.length, 28);
    cd.writeUInt32LE(0, 38);
    cd.writeUInt32LE(0, 42); // LFH at offset 0
    const eocd = Buffer.alloc(22);
    eocd.writeUInt32LE(0x06054b50, 0);
    eocd.writeUInt16LE(1, 8);
    eocd.writeUInt16LE(1, 10);
    eocd.writeUInt32LE(46 + nameBuf.length, 12);
    eocd.writeUInt32LE(30 + nameBuf.length + compressed.length, 16);
    const buf = Buffer.concat([lfh, nameBuf, compressed, cd, nameBuf, eocd]);

    await withTempDir(async (dir) => {
      const r = await extractToDir(buf, dir);
      assert.equal(r.entries, 1);
      const out = await readFile(join(dir, 'extension/big.txt'));
      assert.equal(out.length, original.length);
      assert.equal(out.toString('utf8'), original.toString('utf8'));
    });
  });

  it('lists entries without extracting', () => {
    const buf = buildZip([{ name: 'a.txt', data: 'x' }, { name: 'b.txt', data: 'yy' }]);
    const out = listEntries(buf);
    assert.equal(out.length, 2);
    assert.equal(out[0].name, 'a.txt');
    assert.equal(out[1].uncompSize, 2);
  });
});

describe('extractToDir — adversarial', () => {
  it('rejects zip-slip via parent traversal', async () => {
    const buf = buildZip([{ name: '../escape.txt', data: 'pwned' }]);
    await withTempDir(async (dir) => {
      await assert.rejects(() => extractToDir(buf, dir), /traversal/);
      const items = await readdir(dir);
      assert.equal(items.length, 0, 'no files should have been written');
    });
  });

  it('rejects zip-slip via absolute POSIX path', async () => {
    const buf = buildZip([{ name: '/tmp/leak.txt', data: 'pwned' }]);
    await withTempDir(async (dir) => {
      await assert.rejects(() => extractToDir(buf, dir), /absolute|traversal/);
    });
  });

  it('rejects symlink entries', async () => {
    const buf = buildZip([{
      name: 'evil-link',
      data: '../../etc/passwd',
      versionMadeBy: (3 << 8) | 20,
      externalAttr: unixModeAttr(MODE_SYMLINK),
    }]);
    await withTempDir(async (dir) => {
      await assert.rejects(() => extractToDir(buf, dir), /symlink/);
    });
  });

  it('rejects entries beyond maxEntries cap', async () => {
    const entries = Array.from({ length: 5 }, (_, i) => ({ name: `f${i}.txt`, data: 'x' }));
    const buf = buildZip(entries);
    await withTempDir(async (dir) => {
      await assert.rejects(
        () => extractToDir(buf, dir, { caps: { ...DEFAULT_CAPS, maxEntries: 3 } }),
        /too many/,
      );
    });
  });

  it('rejects zip-bomb: STORED entry exceeding maxUncompressedBytes', async () => {
    const buf = buildZip([{
      name: 'bomb.txt',
      data: Buffer.alloc(2000),
      declaredUncompSize: 2000,
    }]);
    await withTempDir(async (dir) => {
      await assert.rejects(
        () => extractToDir(buf, dir, { caps: { ...DEFAULT_CAPS, maxUncompressedBytes: 1000 } }),
        /maxUncompressedBytes/,
      );
    });
  });

  it('rejects zip-bomb: deflate expansion ratio exceeds cap', async () => {
    // Build an entry with a high uncompressed size and a tiny compressed size.
    const original = Buffer.alloc(20_000); // 20KB of zeros — compresses tiny
    const compressed = deflateRawSync(original);
    const nameBuf = Buffer.from('bomb.bin', 'utf8');
    const lfh = Buffer.alloc(30);
    lfh.writeUInt32LE(0x04034b50, 0);
    lfh.writeUInt16LE(20, 4);
    lfh.writeUInt16LE(8, 8);
    lfh.writeUInt32LE(compressed.length, 18);
    lfh.writeUInt32LE(original.length, 22);
    lfh.writeUInt16LE(nameBuf.length, 26);
    const cd = Buffer.alloc(46);
    cd.writeUInt32LE(0x02014b50, 0);
    cd.writeUInt16LE(20, 4); cd.writeUInt16LE(20, 6); cd.writeUInt16LE(8, 10);
    cd.writeUInt32LE(compressed.length, 20);
    cd.writeUInt32LE(original.length, 24);
    cd.writeUInt16LE(nameBuf.length, 28);
    cd.writeUInt32LE(0, 42);
    const eocd = Buffer.alloc(22);
    eocd.writeUInt32LE(0x06054b50, 0);
    eocd.writeUInt16LE(1, 8); eocd.writeUInt16LE(1, 10);
    eocd.writeUInt32LE(46 + nameBuf.length, 12);
    eocd.writeUInt32LE(30 + nameBuf.length + compressed.length, 16);
    const buf = Buffer.concat([lfh, nameBuf, compressed, cd, nameBuf, eocd]);
    await withTempDir(async (dir) => {
      await assert.rejects(
        () => extractToDir(buf, dir, { caps: { ...DEFAULT_CAPS, maxExpansionRatio: 5 } }),
        /expansion ratio|exceeds/,
      );
    });
  });

  it('rejects unknown compression methods', async () => {
    // Manually craft an entry with method=6 (Implode, unsupported).
    const nameBuf = Buffer.from('weird.bin', 'utf8');
    const data = Buffer.from('x');
    const lfh = Buffer.alloc(30);
    lfh.writeUInt32LE(0x04034b50, 0);
    lfh.writeUInt16LE(6, 8); // method=Implode
    lfh.writeUInt32LE(data.length, 18);
    lfh.writeUInt32LE(data.length, 22);
    lfh.writeUInt16LE(nameBuf.length, 26);
    const cd = Buffer.alloc(46);
    cd.writeUInt32LE(0x02014b50, 0);
    cd.writeUInt16LE(6, 10);
    cd.writeUInt32LE(data.length, 20);
    cd.writeUInt32LE(data.length, 24);
    cd.writeUInt16LE(nameBuf.length, 28);
    const eocd = Buffer.alloc(22);
    eocd.writeUInt32LE(0x06054b50, 0);
    eocd.writeUInt16LE(1, 8); eocd.writeUInt16LE(1, 10);
    eocd.writeUInt32LE(46 + nameBuf.length, 12);
    eocd.writeUInt32LE(30 + nameBuf.length + data.length, 16);
    const buf = Buffer.concat([lfh, nameBuf, data, cd, nameBuf, eocd]);
    await withTempDir(async (dir) => {
      await assert.rejects(() => extractToDir(buf, dir), /unsupported compression/);
    });
  });

  it('throws ZipError when EOCD is missing', async () => {
    const garbage = Buffer.from('not a zip file at all');
    await withTempDir(async (dir) => {
      await assert.rejects(() => extractToDir(garbage, dir), /EOCD/);
    });
  });
});
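The "EOCD is missing" case is why a ZIP parser must scan backwards for the End Of Central Directory signature rather than assume a fixed offset. A minimal standalone locator sketch (assumed behaviour under the ZIP format's rules, not the extractor's exact code — `findEocd` is an illustrative name):

```javascript
const SIG_EOCD = 0x06054b50;

// Scan backwards from the end: the EOCD record is 22 bytes minimum, but a
// trailing archive comment (up to 65535 bytes) can push it away from the end.
function findEocd(buf) {
  for (let i = buf.length - 22; i >= 0; i--) {
    if (buf.readUInt32LE(i) === SIG_EOCD) return i;
  }
  return -1; // no EOCD — caller should reject the buffer as "not a ZIP"
}

const eocd = Buffer.alloc(22);
eocd.writeUInt32LE(SIG_EOCD, 0);
console.log(findEocd(eocd)); // 0
console.log(findEocd(Buffer.from('not a zip file at all'))); // -1
```

Scanning a bounded tail also caps the work an attacker can force with a huge trailing comment.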