ktg-plugin-marketplace/plugins/ultraplan-local/examples/README.md
Kjell Tore Guttormsen 14ecda886c feat(voyage)!: bulk content rewrite ultra -> voyage/trek prose [skip-docs]
Sed pipeline (16 patterns, longest-match-first) sweeping residual ultra* hits
in prose, command narrative, agent prompts, hook comments, and doc prose.

Pipeline extensions from the V4 prompt:
- BSD syntax [[:<:]]ultra[[:>:]] instead of \bultra\b (BSD sed lacks \b)
- 6 compound patterns for ultraplan/ultraexecute/ultraresearch/ultrabrief/
  ultrareview/ultracontinue without the -local suffix
- ultra*-stats glob -> trek*-stats glob
- Line exclusion reduced to ultra-cc-architect (Q8); the session-state
  exclusion was over-protective
- File exclusion extended to settings.json, package.json, plugin.json,
  and the whole .claude/ tree (gitignored + V5 territory)

Q8 exception held: architecture-discovery.mjs + project-discovery.mjs untouched.
Filename convention held: .session-state.local.json + *.local.* preserved.

Manual narrative fix: tests/lib/agent-frontmatter.test.mjs line 10
mangled "/ultra*-local" to "/voyage*-local" (no such command exists);
corrected to "/trek*".

Residuals out of scope (V5 handles them): package.json + .claude-plugin/
plugin.json (Step 12-14 version bump). .claude/* is gitignored
spec history with an intentional BEFORE/AFTER narrative.

Part of voyage-rebrand session 3 (Wave 4 / Step 10).

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>
2026-05-05 15:08:20 +02:00


# Examples
Complete, calibrated walk-throughs of the trekplan pipeline for
realistic tasks. Each example shows the four artifacts a project
directory contains after a full run:
- `brief.md` — task brief from `/trekbrief`
- `research/*.md` — research briefs from `/trekresearch`
- `plan.md` — implementation plan from `/trekplan`
- `progress.json` — execution log from `/trekexecute`
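
As a sketch, a finished example directory might look like this (the
research filename is illustrative; real names vary per task):

```text
01-add-verbose-flag/
├── brief.md
├── research/
│   └── parser-survey.md
├── plan.md
└── progress.json
```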
These are **hand-calibrated**, not LLM-generated. The point is to give
anyone forking the plugin a deterministic reference: what the artifacts
look like when everything goes right, with a small but real task.
## Running the pipeline yourself
For your own work, point the four commands at a real project directory:
```bash
# Create the project directory first; the four slash commands below run
# inside a Claude Code session, not the shell.
mkdir -p .claude/projects/2026-05-01-my-task
/trekbrief
/trekresearch --project .claude/projects/2026-05-01-my-task
/trekplan --project .claude/projects/2026-05-01-my-task
/trekexecute --project .claude/projects/2026-05-01-my-task
```
The artifacts in each example mirror that flow.
## Examples
### 01-add-verbose-flag
**Task:** add a `--verbose` flag to a small CLI parser. Touches one
parser file and six command handlers; adds two tests.
**Why this example:** small enough to read end-to-end in 10 minutes,
but exercises every artifact (research with brief-anchoring, plan with
manifests, progress.json with multi-step git history). It demonstrates
how the `plan_version: 1.7` schema looks in real life, including the
manifest YAML block per step and the `must_contain` list-of-dicts
form.
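
The schema itself isn't reproduced in this README, so the following is
a hypothetical sketch of one step's manifest block; only
`plan_version: 1.7` and the `must_contain` list-of-dicts form come from
the text above, all other keys are illustrative:

```yaml
# Hypothetical per-step manifest block: keys other than plan_version
# and must_contain are illustrative, not taken from the actual schema.
plan_version: 1.7
step: 1
files:
  - src/parser.js
must_contain:
  - path: src/parser.js
    text: "--verbose"
```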
**What to study first:**
1. `brief.md` — note the explicit `Out of scope` section and concrete
`Success Criteria` (no "make it work" hand-waving).
2. `plan.md` Step 1 — note that the FIRST step captures golden output
*before* any behavior change. This is the stability harness pattern.
3. `plan.md` Step 5 — note that this step touches 5 files in one
commit, and the plan justifies the deviation from the 12 file
guideline. Plan-critic should accept that justification.
4. `progress.json` — every step has both `commit_sha` and
`verify_passed`; this is what lets execution resume from the last
completed step.
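
A minimal sketch of what two completed steps might look like in
`progress.json` (only the `commit_sha` and `verify_passed` keys come
from the text above; everything else, including the shape of the file,
is illustrative):

```json
[
  { "step": 1, "commit_sha": "a1b2c3d", "verify_passed": true },
  { "step": 2, "commit_sha": "e4f5a6b", "verify_passed": true }
]
```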
## Regeneration
Each example has a `REGENERATED.md` documenting the version it was
calibrated against. When the artifact format changes, the example
needs to be rebuilt. See the `REGENERATED.md` file in each example
for triggers and procedure.
## Adding a new example
If you have a small, realistic task (touches 1-3 files, has a clear
success criterion, finishes in under 30 minutes) and want to add it
as an example:
1. Create `examples/NN-slug-here/` with the same four artifacts.
2. Add a `REGENERATED.md` documenting the calibration date and version.
3. Add a section to this README under `## Examples`.
4. Open an issue on the marketplace describing what the example
teaches that 01 doesn't already teach.
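
The scaffolding in steps 1-2 can be sketched as a few shell commands
(the `02-fix-null-deref` slug is hypothetical):

```bash
# Create the skeleton for a new example: the four pipeline artifacts
# plus the REGENERATED.md calibration record.
mkdir -p examples/02-fix-null-deref/research
touch examples/02-fix-null-deref/brief.md
touch examples/02-fix-null-deref/plan.md
touch examples/02-fix-null-deref/progress.json
touch examples/02-fix-null-deref/REGENERATED.md
```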