9e2973059a
* feat(ux): claude-mem UX improvements with installer enhancements
Squashed PR #2156 commits for clean rebase onto main:
- feat(installer): add provider selection, model prompt, worker auto-start
- refactor: rename *Agent provider classes to *Provider
- feat: add /learn-codebase skill and viewer welcome card
- feat(worker): inject welcome hint when project has zero observations
- fix(pr-2156): address greptile review comments
- fix(pr-2156): address coderabbit review comments
- fix(pr-2156): persist CLAUDE_MEM_PROVIDER for non-claude in non-TTY mode
- fix(pr-2156): file-backed settings reads in installer + env-first SKILL doc
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* build: rebuild plugin artifacts after rebase onto v12.4.7
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* refactor(skills): strip claude-mem internals from learn-codebase
The learn-codebase skill, install next-step copy, WelcomeCard, and
welcome-hint previously walked the primary agent through worker endpoints
and synthetic observation payloads. The PostToolUse hook already captures
every Read/Edit the agent makes — the agent should have no awareness that
the memory layer exists. Collapse the skill to one instruction ("read every
source file in full") and rephrase touchpoints to describe only what the
user observes (Claude reading files), not what happens behind the scenes.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* fix(sync): preflight version mismatch + settings-aware port resolution
Two related fixes for build-and-sync's worker restart step:
1. Read CLAUDE_MEM_WORKER_PORT from ~/.claude-mem/settings.json the same
way the worker does, instead of computing the default port from the
uid alone. Previously, users with a custom port saw a misleading
"Worker not running" message because the restart POST hit the wrong
port and got ECONNREFUSED.
2. Add a preflight check that aborts the sync when the running worker's
reported version does not match the version we are about to build.
Claude Code's plugin loader pins the worker to a specific cache
version per session, so syncing into a newer cache directory has no
effect until the user runs `claude plugin update thedotmack/claude-mem`
to bump the pin. The preflight surfaces this explicitly with the exact
command to run; --force bypasses it for intentional cases.
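A minimal sketch of that resolution order (settings file first, then the per-user default; file layout and field handling are assumptions from this message, and the uid formula matches the 37700 + uid % 100 default documented later in this history):

```typescript
import { readFileSync } from "node:fs";
import { join } from "node:path";
import { homedir, userInfo } from "node:os";

// Resolve the worker port the same way the worker does: settings.json wins,
// otherwise fall back to the per-user default formula.
function resolveWorkerPort(): number {
  const settingsPath = join(homedir(), ".claude-mem", "settings.json");
  try {
    const settings = JSON.parse(readFileSync(settingsPath, "utf8"));
    const fromSettings = Number(settings.CLAUDE_MEM_WORKER_PORT);
    if (Number.isInteger(fromSettings) && fromSettings > 0) return fromSettings;
  } catch {
    // No settings file (or unparseable JSON): fall through to the default.
  }
  return 37700 + (userInfo().uid % 100);
}
```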
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* docs(learn-codebase): note sed for partial reads of large files
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* refactor: strip comments codebase-wide
Removed prose comments from all tracked source. Preserved directives
(@ts-ignore, eslint-disable, biome-ignore, prettier-ignore, triple-slash
references, webpack magic, shebangs). Deleted two tests that asserted
on comment text rather than runtime behavior.
Net: 401 files, -14,587 / +389 lines, -10.4% bytes.
Verified: typecheck passes, build passes, test count unchanged from
baseline (22 pre-existing fails, all unrelated).
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* refactor(installer): move runtime setup into npx, eliminate hook dead air
Smart-install ran 3 times during a fresh install — the worst run was silent,
fired by Claude Code's Setup hook after `claude plugin install`, producing
~30s of dead air that made the plugin look hung.
This change makes `npx claude-mem install` the single place heavy work
happens, with a visible spinner. Hooks become runtime-only.
- New `src/npx-cli/install/setup-runtime.ts` module: ensureBun, ensureUv,
installPluginDependencies, read/writeInstallMarker, isInstallCurrent.
Marker schema preserved exactly ({version, bun, uv, installedAt}) so
ContextBuilder and BranchManager readers keep working.
- `npx claude-mem install`: ungated copy/register/enable for every IDE,
inserts a "Setting up runtime" task with honest "first install can take
~30s" spinner. The claude-code shell-out to `claude plugin install` is
removed — npx already populated everything Claude reads.
- New `npx claude-mem repair` command for post-`claude plugin update`
recovery; it force-reinstalls the runtime.
- Setup hook now runs `plugin/scripts/version-check.js` (29ms wall) instead
of smart-install. Mismatch prints "run: npx claude-mem repair" on stderr.
Always exits 0 (non-blocking, per CLAUDE.md exit-code strategy).
- SessionStart loses the smart-install entry; 2 hooks remain (worker start,
context fetch).
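A sketch of the version-check shape (the comparison and the always-zero exit are from this message; the function signature and message text are illustrative):

```typescript
// Compare the plugin's packaged version against the runtime marker's version.
// On mismatch, hint at repair on stderr; never block the session (exit 0).
function versionCheck(pluginVersion: string, markerVersion: string | undefined): number {
  if (markerVersion !== pluginVersion) {
    process.stderr.write(
      `claude-mem: runtime ${markerVersion ?? "missing"} != plugin ${pluginVersion}; ` +
        `run: npx claude-mem repair\n`
    );
  }
  return 0; // always exit 0 — the Setup hook is non-blocking by design
}
```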
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* chore(installer): delete smart-install sources, retarget tests
- Delete scripts/smart-install.js + plugin/scripts/smart-install.js (both
are source files kept in sync manually; both must go).
- Delete tests/smart-install.test.ts (covered surface is gone).
- tests/plugin-scripts-line-endings: drop smart-install.js entry.
- tests/infrastructure/plugin-distribution: retarget two assertions at
version-check.js (the new Setup hook script).
- New tests/setup-runtime.test.ts: 9 tests covering marker read/write,
isInstallCurrent semantics. Marker schema invariant verified.
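The marker invariant these tests pin down could be sketched as follows (schema fields are from this history; paths, values, and helper names are illustrative):

```typescript
import { mkdtempSync, readFileSync, writeFileSync } from "node:fs";
import { join } from "node:path";
import { tmpdir } from "node:os";

// Marker schema preserved exactly, per the setup-runtime commit above.
interface InstallMarker { version: string; bun: string; uv: string; installedAt: string; }

function writeInstallMarker(path: string, marker: InstallMarker): void {
  writeFileSync(path, JSON.stringify(marker, null, 2));
}

function readInstallMarker(path: string): InstallMarker | null {
  try { return JSON.parse(readFileSync(path, "utf8")) as InstallMarker; }
  catch { return null; } // missing or corrupt marker reads as "no install"
}

// "Current" means a marker exists and records the version we expect.
function isInstallCurrent(path: string, version: string): boolean {
  return readInstallMarker(path)?.version === version;
}

// Round-trip example in a throwaway directory:
const dir = mkdtempSync(join(tmpdir(), "cmem-marker-"));
const markerPath = join(dir, "install-marker.json");
writeInstallMarker(markerPath, {
  version: "1.2.3", bun: "1.1.0", uv: "0.5.0", installedAt: new Date().toISOString(),
});
```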
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* docs(installer): describe npx-driven setup + version-check Setup hook
Sweep public docs and architecture notes to reflect the new flow:
npx installer does Bun/uv setup with a visible spinner; Setup hook runs
sub-100ms version-check.js; users hit `npx claude-mem repair` after a
`claude plugin update`.
- docs/architecture-overview.md: hook lifecycle table + npx flow paragraph
- docs/public/configuration.mdx: tree + hook config example
- docs/public/development.mdx: build output line
- docs/public/hooks-architecture.mdx: full rewrite of pre-hook section,
timing table, performance table
- docs/public/architecture/{overview,hooks,worker-service}.mdx: tree
comments, JSON config example, Bun requirement section
docs/reports/* untouched (historical incident reports).
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* fix(install): mergeSettings writes via USER_SETTINGS_PATH
Greptile P1 (#2156): `settingsFilePath()` only resolved
`process.env.CLAUDE_MEM_DATA_DIR`, while `getSetting()` reads via
`USER_SETTINGS_PATH` which `resolveDataDir()` populates from BOTH the env
var AND a `CLAUDE_MEM_DATA_DIR` entry persisted in
`~/.claude-mem/settings.json`. Result: a user with the data dir saved in
settings.json but not exported in their shell would have provider/model
settings silently written to `~/.claude-mem/settings.json` while
`getSetting()` read from `/custom/path/settings.json` — read/write split.
Drop `settingsFilePath()` and the now-unused `homedir` import; reuse the
already-imported `USER_SETTINGS_PATH` constant.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* fix(cli): parse --provider, --model, --no-auto-start install flags
Greptile P1 (#2156): InstallOptions has fields `provider`, `model`,
`noAutoStart`, but the install case in the npx-cli switch only parsed
`--ide`. The other three flags were silently dropped — `npx claude-mem
install --provider gemini` was a no-op.
Extract a `parseInstallOptions(argv)` helper, share it between the bare
`npx claude-mem` and `npx claude-mem install` paths, and validate
`--provider` against the allowed set. Update help text accordingly.
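A sketch of the shared helper (flag names match this message; the allowed-provider set is a hypothetical stand-in, since the real set isn't listed here):

```typescript
interface InstallOptions { ide?: string; provider?: string; model?: string; noAutoStart: boolean; }

const ALLOWED_PROVIDERS = ["claude", "gemini", "openai"]; // illustrative set

// Parse install flags once, shared by `npx claude-mem` and `npx claude-mem install`.
function parseInstallOptions(argv: string[]): InstallOptions {
  const opts: InstallOptions = { noAutoStart: false };
  for (let i = 0; i < argv.length; i++) {
    switch (argv[i]) {
      case "--ide": opts.ide = argv[++i]; break;
      case "--model": opts.model = argv[++i]; break;
      case "--no-auto-start": opts.noAutoStart = true; break;
      case "--provider": {
        const p = argv[++i];
        if (p === undefined || !ALLOWED_PROVIDERS.includes(p)) {
          throw new Error(`unknown provider: ${p}`); // validate against allowed set
        }
        opts.provider = p;
        break;
      }
    }
  }
  return opts;
}
```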
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* fix(install): pipe runtime-setup output, always show IDE multiselect
Two issues caught in a docker test of the installer:
1. The bun.sh installer, uv installer, and `bun install` were using
stdio: 'inherit', dumping their stdout/stderr through clack's spinner
region — visible as raw "downloading uv 0.11.8…" / "Checked 58
installs across 38 packages…" text streaming under the spinner. Switch
to stdio: 'pipe' and surface captured stderr only on failure (via a
shared describeExecError() helper that includes stdout when stderr is
empty). Spinner stays clean on the happy path.
2. promptForIDESelection() silently picked claude-code when no IDEs were
detected, never showing the user the multiselect. On a fresh machine
with no IDEs present yet (e.g. our docker test container), the user
never got to choose. Now: always show the full IDE list when
interactive; mark detected ones with [detected] hints and pre-select
them; when zero are detected, show a warn line explaining they should
pick whatever they plan to use. Non-TTY callers still get the silent
claude-code default at the call site (unchanged).
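The shared helper from item 1 might look like this sketch (assuming it receives the captured fields from a failed exec):

```typescript
// Prefer captured stderr; fall back to stdout when stderr is empty, so piped
// (stdio: 'pipe') subprocess failures stay debuggable without spinner noise.
function describeExecError(err: { stderr?: string; stdout?: string; message?: string }): string {
  const stderr = err.stderr?.trim();
  if (stderr) return stderr;
  const stdout = err.stdout?.trim();
  if (stdout) return stdout;
  return err.message ?? "command failed with no output";
}
```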
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* fix(install): skip marketplace work for claude-code-only, offer to install Claude Code
Two related UX fixes from a docker test:
**Delay between "Saved Claude model=…" and "Plugin files copied OK"**
After dropping the needsManualInstall gate, every install was unconditionally
running `copyPluginToMarketplace` (which copied the entire root node_modules
tree — thousands of files, dozens of seconds) and `runNpmInstallInMarketplace`
(npm install --production) even when only claude-code was selected. Neither
is needed for claude-code: that path uses the plugin cache dir + the
installed_plugins.json + enabledPlugins flag, all of which we already write.
- Drop `node_modules` from `copyPluginToMarketplace`'s allowed-entries list;
the dependency-install task populates it on the destination side anyway.
- Re-introduce `needsMarketplace = selectedIDEs.some(id => id !== 'claude-code')`
scoped *only* to `copyPluginToMarketplace`, `runNpmInstallInMarketplace`,
and the pre-install `shutdownWorkerAndWait` (also pointless for claude-code-
only flows since we're not overwriting the worker's running cache dir
source). All other tasks (cache copy, register, enable, runtime setup) stay
unconditional.
**Claude Code missing → silent install of an IDE that isn't there**
When the user picked claude-code on a machine without it (e.g. a fresh
container), the install completed but `claude` was unavailable and the only
hint was a generic warn line. Replace with an explicit pre-flight prompt:
Claude Code is not installed. Claude-mem works best in Claude Code, but
also works with the IDEs below.
? Install Claude Code now?
◆ Yes — install Claude Code (recommended)
◯ No — pick another IDE below
◯ Cancel installation
If the user picks "Yes", run `curl -fsSL https://claude.ai/install.sh | bash`
(or the PowerShell equivalent on Windows), then re-detect IDEs and proceed
with claude-code pre-selected. If the install fails or the user picks "No",
the multiselect still appears with claude-code visible (just unmarked
[detected]), so they can opt in or pick another IDE.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* fix(install): detect Claude Code via `claude` CLI, not ~/.claude dir
The directory `~/.claude` can exist (e.g. mounted in Docker, or created
by tooling) without Claude Code actually being installed. Detect the
`claude` command in PATH instead so the installer correctly offers to
install Claude Code when missing.
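One way to probe PATH instead of the directory (a sketch; the shipped detection may differ):

```typescript
import { execSync } from "node:child_process";

// Ask the shell whether `claude` resolves on PATH, rather than trusting that
// ~/.claude exists (it can be mounted or tool-created without an install).
function isClaudeCodeInstalled(): boolean {
  try {
    const probe = process.platform === "win32" ? "where claude" : "command -v claude";
    execSync(probe, { stdio: "ignore" }); // throws on non-zero exit
    return true;
  } catch {
    return false;
  }
}
```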
* docs(learn-codebase): add reviewer note explaining the cost tradeoff
The skill intentionally reads every file in full to build a cognitive
cache that pays off across the rest of the project. Add a brief note
so reviewers (human or bot) understand the tradeoff before flagging
the unbounded read as a cost issue.
* fix: address Greptile P1 feedback on welcome hint and learn-codebase
- SearchRoutes: skip welcome hint when caller passes ?full=true so
explicit full-context requests aren't intercepted by the hint.
- learn-codebase: replace `sed` instruction with the Read tool's
offset/limit parameters, since Bash is gated in Claude Code by
default.
* feat(install): ASCII-animated logo splash on interactive install
Plays a ~1s bloom animation of the claude-mem sunburst logomark when
the installer starts in an interactive terminal — geometrically rendered
via 12 ray curves around a center disc, in the brand orange. The
wordmark and tagline type on alongside the final frame.
Auto-skipped on non-TTY, in CI, when NO_COLOR or CLAUDE_MEM_NO_BANNER
is set, or when the terminal is too narrow.
Inspired by ghostty +boo.
* feat(banner): replace rotation frames with angular-sector bloom generator
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* feat(banner): three-act choreography renderer with radial gradient and diff redraw
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* feat(banner): update preview script to support small/medium/hero tier selection
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* fix(docker): add COLORTERM=truecolor to test-installer sandbox
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* feat(install): auto-apply PATH for Claude Code with spinner UX
The Claude Code install.sh prints a Setup notes block telling users to
manually edit "your shell config file" to add ~/.local/bin to PATH —
which left fresh installs unable to launch claude from the command line.
After a successful install, detect ~/.local/bin/claude on disk and, if
the dir is missing from PATH, append the right export line to .zshrc /
.bash_profile / .bashrc / fish config (idempotent, marked with a
comment). Also updates process.env.PATH for the current install run.
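The idempotent append could look like this sketch (marker text and rc-file handling are illustrative):

```typescript
import { appendFileSync, existsSync, mkdtempSync, readFileSync } from "node:fs";
import { join } from "node:path";
import { tmpdir } from "node:os";

const MARKER = "# added by claude-mem installer"; // idempotency marker (illustrative)

// Append an export line once; the marker comment makes repeat runs no-ops.
function ensurePathExport(rcFile: string, dir: string): boolean {
  const line = `export PATH="${dir}:$PATH" ${MARKER}`;
  const current = existsSync(rcFile) ? readFileSync(rcFile, "utf8") : "";
  if (current.includes(MARKER)) return false; // already applied
  appendFileSync(rcFile, `\n${line}\n`);
  return true;
}

// Example against a throwaway rc file:
const rc = join(mkdtempSync(join(tmpdir(), "cmem-rc-")), ".bashrc");
```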
Wraps the curl|bash install in a clack spinner (interactive only) so the
~4 minute native-build download doesn't look frozen — output is captured
silently and dumped on failure for debuggability. Non-interactive mode
keeps inherited stdio for CI logs.
Verified end-to-end in the test-installer docker sandbox: spinner
animates, .bashrc gets the export, fresh login shell resolves claude.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* feat(banner): video-frame ASCII renderer with three-act choreography
Generator switched from a single Jimp-rendered logo to pre-extracted
video frames concatenated with \x01 separators and gzip-deflated, ported
from ghostty's boo wire format. Renderer rewritten around three acts
(ignite → stagger bloom → text reveal + breathe) with adaptive sizing,
radial gradient, and diff-based redraw.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* feat(onboarding): unify install / SessionStart / viewer around one first-success moment
Three surfaces now point at the same north-star moment — open the viewer, do
anything in Claude Code, watch an observation appear within seconds — with the
same verbatim timing and privacy lines, and a single canonical "how it works"
explainer instead of three diverging copies.
- Canonical explainer at src/services/worker/onboarding-explainer.md served via
GET /api/onboarding/explainer; mirrored into plugin/skills/how-it-works/SKILL.md
- SessionStart welcome hint rewritten as third-person status (no imperatives
Claude tries to execute), pinned with a default-value regression test
- Post-install Next Steps reframed as "two paths": passive default + optional
/learn-codebase front-load; drops /mem-search and /knowledge-agent from this
surface; adds verbatim timing + privacy lines and /how-it-works link
- /api/stats response gains firstObservationAt for the viewer stat row
- Viewer WelcomeCard branches on observationCount === 0: empty state shows live
worker-connection dot + "waiting for activity"; has-data state shows
observations · projects · since [date] and two example prompts. v2 dismiss key
- jimp added to package.json to fix pre-existing banner-frame build break
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* fix(banner): play unconditionally; only honor CLAUDE_MEM_NO_BANNER
The 128-col / TTY / CI / NO_COLOR gates silently swallowed the banner in
narrower terminals, CI logs, and any non-TTY pipe — including Docker runs
where -it should have preserved the experience but the column-width check
was the wrong gate. Remove the implicit gates; keep the explicit opt-out only.
If a frame wraps in a narrow terminal, that's better than the banner
not playing at all.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* revert(banner): restore 15:33 gating logic per user request
Reverts eb6fc157. Restores isBannerEnabled to the state at commit
8e448015 (2026-04-30 15:33): TTY check, !CI, !NO_COLOR, !CLAUDE_MEM_NO_BANNER,
and cols >= BANNER.width.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* feat(install): wrap remaining slow steps with spinners
Each IDE installer (Cursor, Gemini CLI, OpenCode, Windsurf, OpenClaw,
Codex CLI, MCP integrations) now runs inside a clack task spinner with
per-step progress messages instead of silent dynamic-import + cpSync.
Pre-overwrite worker shutdown (up to 10s) and the post-install health
probe (up to 3s) also get spinners.
Internal console.log/error/warn from each IDE installer is buffered
during the spinner; if the install fails, captured output is replayed
afterward via log.warn so users can see what broke.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* fix(review): observation count + IDE pre-selection regressions
WelcomeCard's "no observations yet" empty state was triggered when a
project filter narrowed the feed to zero rows, even with thousands of
observations elsewhere. Source the count from global stats.database
to match firstObservationAt's scope.
Restore initialValues: [] in the IDE multiselect — pre-selecting every
detected IDE was the exact regression #2106 was filed for.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* fix(install): trichotomy worker state + cache fallback for script path
ensureWorkerStarted now returns 'ready' | 'warming' | 'dead' instead of
boolean. The spawned-but-still-warming case (common in Docker cold
starts and slow first-time inits) was being misreported as 'did not
start', which contradicted the next-steps panel saying 'still starting
up'. Install task message and Next Steps headline now agree on the
actual state.
Also fixes the actual root cause of 'Worker did not start' on
claude-code-only installs: the worker script path was hardcoded to the
marketplace dir, which is left empty when no non-claude-code IDE is
selected. Now falls back to pluginCacheDirectory(version) when the
marketplace copy isn't present.
Verified end-to-end in docker/claude-mem with --ide claude-code,
--ide cursor, and a fresh container — install task and headline
agree on 'Worker ready at http://localhost:<port>' in all cases.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* docs: align CLAUDE.md and public docs with current code
Sweep across CLAUDE.md and 10 high-traffic docs/public/ MDX files to
remove point-in-time references and align with the actual current
shape of the codebase. Highlights:
- Hardcoded port 37777 → per-user formula (37700 + uid % 100) on the
front-door pages (introduction, installation, configuration,
architecture/overview, architecture/worker-service, troubleshooting,
hooks-architecture, platform-integration).
- Default model 'sonnet' → 'claude-haiku-4-5-20251001' (matches
SettingsDefaultsManager).
- Node 18 → 20 (matches package.json engines).
- Lifecycle hook count corrected (5 events).
- Removed the nonexistent 'Smart Install' component and pre-built
directory tree referencing files that no longer exist
(context-hook.ts, save-hook.ts, cleanup-hook.ts, etc.); replaced
with the real worker dispatcher shape.
- Removed CLAUDE.md '#2101' issue tag (kept the design rationale).
- Replaced obsolete hooks.json example with a description of the real
bun-runner.js / worker-service.cjs hook event shape.
Lower-traffic doc pages still hardcode 37777 — left for a separate
global pass.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* chore(scripts): land strip-comments around real parsers (postcss, remark, parse5)
Each language gets a real parser to locate comments, then we splice ranges
out of the original source. The library never serializes — that's how
remark-stringify produced 243 reformat-noise diffs in the first attempt
versus the 21 real strip targets here.
JS/TS/JSX -> ts.createSourceFile + getLeadingCommentRanges
CSS/SCSS -> postcss.parse + walkComments + node.source offsets
MD/MDX -> remark-parse (+ remark-mdx) + AST html / mdx-expression nodes
HTML -> parse5 with sourceCodeLocationInfo
shell/py -> kept hand-rolled hash stripper (no library worth the dep)
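The parse-then-splice approach (locate comment ranges with a real parser, cut them from the original text, never re-serialize) reduces to a range splice like this sketch:

```typescript
// A comment range as byte offsets into the original source, as produced by
// any of the parsers above (e.g. getLeadingCommentRanges, node.source offsets).
type Range = { pos: number; end: number };

// Remove the given ranges from the source, keeping everything else verbatim —
// this is what avoids the reformat-noise diffs a stringifier would produce.
function spliceRanges(source: string, ranges: Range[]): string {
  let out = "";
  let cursor = 0;
  for (const { pos, end } of [...ranges].sort((a, b) => a.pos - b.pos)) {
    out += source.slice(cursor, pos);
    cursor = end;
  }
  return out + source.slice(cursor);
}
```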
Preserves: shebangs, @ts-* directives, eslint-disable, biome-ignore,
prettier-ignore, triple-slash refs, webpack magic, /*! license keep,
@strip-comments-keep file marker. JS/TS handler runs a parse-roundtrip
check and refuses to write if syntax errors increased (catches the
worker-utils.ts breakage class from the 2026-04-29 attempt).
npm scripts:
strip-comments (apply)
strip-comments:check (CI-style, exits non-zero if changes needed)
strip-comments:dry-run (list, no writes)
Verified --check on this repo: 21 changes, -4.0% bytes, no parse-error
regressions, no reformat-suspect false positives.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* refactor: strip comments codebase-wide via parser-backed tool
21 files changed, -17,550 bytes (-4.0%) of narrative comments removed
across .ts / .tsx / .js / .mjs and the .gitignore. JS/TS comments stripped
via ts.createSourceFile + getLeadingCommentRanges — same canonical lexer,
same behavior as the 2026-04-29 strip, no reformat noise.
Preexisting baseline (unchanged):
typecheck: 16 errors at HEAD, 16 errors after strip (line numbers shift,
no new error classes — verified via diff of sorted error lists)
build: fails at HEAD with CrushHooksInstaller.js unresolved import
(preexisting, unrelated to this strip)
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* fix(install): drop Crush integration references after extract
The Crush integration was extracted to its own branch on May 1, but the
import at install.ts:280 (and the case block + ide-detection entry +
McpIntegrations config + npx-cli help text) still referenced the now-
removed CrushHooksInstaller.js, breaking the build.
Removes:
- case 'crush' block in install.ts
- crush entry in ide-detection.ts
- CRUSH_CONFIG and registration in McpIntegrations.ts
- 'crush' from the IDE Identifiers help line in index.ts
Rebuilds worker-service.cjs to match.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* chore(banner): mark generated banner-frames.ts with @strip-comments-keep
Without this, every build/strip cycle ping-pongs five lines of doc
comments in and out of the auto-generated output. The keep-marker tells
strip-comments.ts to skip the file entirely.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* fix(build): drop banner-frame regen from build script
generate-banner-frames.mjs requires PNG frames in /tmp/cmem-banner-frames
that only exist after the maintainer runs ffmpeg locally on the source
video. CI has neither the video nor the frames, so the build broke on
Windows. The output (src/npx-cli/banner-frames.ts) is committed, so the
regen is a one-shot dev step — not a build step. Run the script directly
when the video changes.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* fix(worker): unstick the spinner — kill claim-self-lock, wake on fail, auto-broadcast
Three surgical changes that cure the stuck-spinner bug at the source.
Phase 1.1 (L9): claimNextMessage no longer self-excludes its own worker_pid.
A single UPDATE-RETURNING grabs the oldest pending row by id. Removes the
LiveWorkerPidsProvider plumbing that was never injected — Supervisor enforces
single-worker via PID file, so the multi-worker SQL was defending against a
configuration the project does not support.
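The single-statement claim might be shaped like this; the SQL and table/column names are assumptions from this message, and the function mirrors the same semantics over in-memory rows for illustration:

```typescript
// Assumed schema: pending_messages(id, status, worker_pid, ...). One atomic
// UPDATE ... RETURNING grabs the oldest pending row — no self-exclusion.
const CLAIM_SQL = `
  UPDATE pending_messages
     SET status = 'processing', worker_pid = ?
   WHERE id = (SELECT id FROM pending_messages
                WHERE status = 'pending'
                ORDER BY id LIMIT 1)
  RETURNING *`;

// Equivalent semantics without a database, for illustration:
type Row = { id: number; status: "pending" | "processing"; workerPid?: number };
function claimNextMessage(rows: Row[], pid: number): Row | undefined {
  const oldest = rows
    .filter(r => r.status === "pending")
    .sort((a, b) => a.id - b.id)[0];
  if (!oldest) return undefined;
  oldest.status = "processing";
  oldest.workerPid = pid;
  return oldest;
}
```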
Phase 1.2 (L19): SessionManager.markMessageFailed wraps PendingMessageStore.markFailed
and emits 'message' on the per-session EventEmitter. The iterator's waitForMessage
now wakes immediately on re-pend instead of parking for 3 minutes. ResponseProcessor
and SessionRoutes routed through the new wrapper.
Phase 1.3 (L24): PendingMessageStore takes an optional onMutate callback fired
from every mutator (enqueue, claimNextMessage, confirmProcessed, markFailed,
transitionMessagesTo, clearFailedOlderThan). SessionManager wires it; WorkerService
passes broadcastProcessingStatus. Ten manual broadcast calls deleted across
SessionCleanupHelper, SessionEventBroadcaster, SessionRoutes, DataRoutes, and
worker-service. Caller discipline becomes structurally impossible to forget.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* refactor(worker): delete dead code — legacy routes, processPendingQueues, decorative guards
Pure deletions. Phase 2 of kill-the-asshole-gates.
- Legacy /sessions/:sessionDbId/* routes (handleSessionInit, handleObservations,
handleSummarize, handleSessionStatus, handleSessionDelete, handleSessionComplete)
bypassed all five ingest gates and were a parallel write path. Folded the
initializeSession + broadcastNewPrompt + syncUserPrompt + ensureGeneratorRunning
+ broadcastSessionStarted work into the canonical /api/sessions/init handler so
the hook makes one round trip instead of two.
- processPendingQueues (~104 lines, zero callers) — replaced in Phase 6 by a
one-statement startup sweep.
- spawnInProgress Map and crashRecoveryScheduled Set — decorative dedupe over
generatorPromise and stillExists checks that already provide the real safety.
- STALE_GENERATOR_THRESHOLD_MS — pre-empted live generators and raced with the
finally block; the 3min idle timeout already kills zombies.
- MAX_SESSION_WALL_CLOCK_MS — ran a SELECT on every observation to enforce 24h.
Runaway-spend protection lives in the API key, not in claude-mem.
- Missing-id 400 in shared.ts ingestObservation — Zod already enforces min(1)
on contentSessionId and toolName at the route schema.
- SessionCompletionHandler import + completionHandler field on SessionRoutes
(orphaned after handler deletions).
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* refactor(worker): SQL-backed getTotalQueueDepth — single source of truth
Was: iterate this.sessions.values() and sum getPendingCount per session.
Now: SELECT COUNT(*) FROM pending_messages WHERE status IN ('pending','processing').
The in-memory sessions Map drifted from the DB rows whenever a generator exited
without confirm/fail, leading to false-positive isProcessing in the UI. Phase 1.3's
auto-broadcast fires on every mutation, but it broadcast a stale Map count.
Reading from the DB makes the UI's spinner state match what the queue actually holds.
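Sketch of the DB-backed count versus the old Map-sum (table and column names assumed from context):

```typescript
// One statement, one source of truth — the rows themselves, not a Map that can
// drift when a generator exits without confirm/fail.
const DEPTH_SQL = `SELECT COUNT(*) FROM pending_messages
                    WHERE status IN ('pending', 'processing')`;

// Same semantics over plain rows, for illustration:
function totalQueueDepth(rows: { status: string }[]): number {
  return rows.filter(r => r.status === "pending" || r.status === "processing").length;
}
```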
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* refactor(worker): typed abortReason replaces wasAborted boolean
Was: a boolean wasAborted that lumped every abort together. The finally block
branched on !wasAborted, so any abort skipped restart — including idle aborts
with pending work, which is exactly the case where we DO want to restart.
Now: ActiveSession.abortReason is a typed enum 'idle' | 'shutdown' | 'overflow'
| 'restart-guard'. The finally block consumes the reason and only skips restart
for 'shutdown' and 'restart-guard'. Idle and overflow aborts fall through, so
if pending work exists they trigger restart correctly.
Dropped 'stale' and 'wall-clock' from the union — Phase 2 deleted those paths.
Natural-completion abort (post-success) intentionally has no reason; it's not
gating restart logic.
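The union and the restart gate, as described (a sketch, not the shipped code):

```typescript
// 'stale' and 'wall-clock' are intentionally absent — Phase 2 deleted those paths.
type AbortReason = "idle" | "shutdown" | "overflow" | "restart-guard";

// Only deliberate stops skip restart; idle and overflow fall through so pending
// work can trigger a respawn. Natural completion carries no reason (undefined).
function shouldSkipRestart(reason: AbortReason | undefined): boolean {
  return reason === "shutdown" || reason === "restart-guard";
}
```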
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* refactor(worker): unify the two generator-exit finally blocks
Was: worker-service.ts:startSessionProcessor and SessionRoutes:ensureGeneratorRunning
each had their own ~70-line finally block with divergent restart-guard handling.
The worker-service path called terminateSession on RestartGuard trip and orphaned
pending rows (the L16 bug); the SessionRoutes path drained them. Two places to
update when rules changed.
Now: handleGeneratorExit in src/services/worker/session/GeneratorExitHandler.ts
owns the contract:
1. Always kill the SDK subprocess if alive.
2. Always drain processingMessageIds via sessionManager.markMessageFailed
(which wakes the iterator — Phase 1.2).
3. shutdown / restart-guard reasons: drain pending rows via
transitionMessagesTo('failed'), finalize, remove from Map. Fixes L16.
4. pendingCount=0: finalize normally and remove from Map.
5. pendingCount>0: backoff respawn via per-session respawnTimer (no global Set;
Phase 2.4 deleted that). RestartGuard trip drains to 'abandoned'.
Both finally blocks are now ~10-line wrappers that translate local state into the
canonical abortReason and delegate. Restored completionHandler injection into
SessionRoutes (was dropped in Phase 2 cleanup; needed by the unified helper for
finalizeSession).
Behavior change: SessionRoutes' previous "keep idle session in memory" was
deliberately replaced by the plan's "remove from Map on natural completion" —
next observation reinitializes via getMessageIterator → initializeSession.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* feat(worker): startup orphan sweep — reset 'processing' rows at boot
When the worker dies (crash, kill, restart), any pending_messages rows it left
in 'processing' state are by definition orphans (the only worker is dead).
Single SQL UPDATE at boot resets them to 'pending' so the iterator can claim
them again. Replaces the deleted processPendingQueues function (Phase 2.2).
Runs in initializeBackground after dbManager.initialize() and before the
initializationComplete middleware releases blocked HTTP requests, so no
in-flight request can race the sweep. NOT on a periodic timer — after boot,
every 'processing' row has a live consumer and a periodic sweep would race.
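The boot sweep reduces to one statement; sketched here with an in-memory equivalent (names assumed from context):

```typescript
// At boot the only worker is this one, so every leftover 'processing' row is an
// orphan from a dead worker — reset it so the iterator can claim it again.
const SWEEP_SQL = `UPDATE pending_messages SET status = 'pending'
                    WHERE status = 'processing'`;

// Same semantics over plain rows, for illustration; returns rows reset.
function sweepOrphans(rows: { id: number; status: string }[]): number {
  let reset = 0;
  for (const row of rows) {
    if (row.status === "processing") {
      row.status = "pending";
      reset++;
    }
  }
  return reset;
}
```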
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* refactor(worker): simplify enqueue catch, replace memorySessionId throw with re-pend
7.1: queueObservation's catch was logging two ERROR-level messages and rethrowing.
The rethrow is correct (FK violations / disk full / schema drift should crash
loudly), but the verbose ERROR logging pretended the error was recoverable.
Reduced to one INFO line + rethrow.
7.2: ResponseProcessor's memorySessionId guard was throwing if the SDK hadn't
included session_id on the first user-yield, terminal-failing the entire batch.
Now warns and re-pends in-flight messages via sessionManager.markMessageFailed
(which wakes the iterator — Phase 1.2). The next iteration retries, by which
point memorySessionId has hopefully been captured.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* fix(sync): mirror builds to installed-version cache for hot reload
When package.json bumps past Claude Code's installed pin, sync-marketplace
wrote new code to cache/<buildVersion>/ but the worker loaded from
cache/<installedVersion>/, so worker:restart reloaded the same old code.
Replace the exit-on-mismatch preflight with a mirror step: when versions
differ, also rsync plugin/ into cache/<installedVersion>/ so worker:restart
hot-reloads new code without a Claude Code session restart. The
build-version cache still gets written for the eventual
`claude plugin update`.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* chore: delete dead barrel files and orphan utilities
- src/sdk/index.ts (re-exports parser+prompts; nothing imported the barrel)
- src/services/Context.ts (re-exports ./context/index.js; no importers)
- src/services/integrations/index.ts (no importers)
- src/services/worker/Search.ts (3-line barrel of ./search/index.js)
- src/services/infrastructure/index.ts: drop CleanupV12_4_3 re-export
- src/utils/error-messages.ts (getWorkerRestartInstructions never imported)
- src/types/transcript.ts (170 LoC of types, zero importers)
- src/npx-cli/_preview.ts (banner dev preview, no script wires it)
Build + tests still pass; observations still flowing.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* chore(parser): drop unused detectLanguage
Only the user-grammar-aware variant detectLanguageWithUserGrammars()
is actually called.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* chore(types): drop unused SdkSessionRecord + ObservationWithContext
Both interfaces in src/types/database.ts had zero importers anywhere
in src or tests.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* chore(npx-cli): drop unused getDetectedIDEs + claudeMemDataDirectory
getDetectedIDEs has no callers — install.ts uses detectInstalledIDEs
directly. claudeMemDataDirectory has no callers either.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* chore(ProcessManager): drop dead orphan-reaper + signal-handler helpers
Each had zero callers in src/ or tests/:
- cleanupOrphanedProcesses + enumerateOrphanedProcesses
- ORPHAN_PROCESS_PATTERNS + ORPHAN_MAX_AGE_MINUTES
- forceKillProcess
- waitForProcessesExit
- createSignalHandler
- resetWorkerRuntimePathCache
The orphan reaper was retired in PATHFINDER Plan 02 ("OS process groups
replace hand-rolled reapers", commit 94d592f2) — these were the leftover
pieces. shutdown.ts uses the supervisor's own kill-pgid path instead.
parseElapsedTime kept (covered by tests/infrastructure/process-manager.test.ts).
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* chore(scripts): delete 11 unreferenced DX/forensic scripts
None of these are referenced by package.json npm scripts or docs/.
Each was last touched on Apr 29, and only by the comment-stripping
pass — the feature code itself is older and orphaned:
analyze-transformations-smart.js
debug-transcript-structure.ts
dump-transcript-readable.ts
endless-mode-token-calculator.js
extract-prompts-to-yaml.cjs
extract-rich-context-examples.ts
find-silent-failures.sh
fix-all-timestamps.ts
format-transcript-context.ts
test-transcript-parser.ts
transcript-to-markdown.ts
These are standalone tools — runtime behavior unchanged.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* chore(scripts): delete unused extraction/ and types/ subdirs
- scripts/extraction/{extract-all-xml.py, filter-actual-xml.py, README.md}
point at ~/Scripts/claude-mem/ — the user's pre-relocation path that no
longer exists. Zero references in package.json, src/, or tests/.
- scripts/types/export.ts duplicates ObservationRecord etc. and has no
importers (CodexCliInstaller imports transcripts/types, not this).
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* chore(BranchManager): drop dead getInstalledPluginPath
OpenCodeInstaller has its own (used) getInstalledPluginPath; the
BranchManager copy never had any external callers.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* chore(ChromaSyncState): unexport DocKind (used internally only)
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* test(gemini): drop stale earliestPendingTimestamp / processingMessageIds
Both fields were removed from ActiveSession in earlier queue-engine
cleanup. Tests had been silently keeping them because the mock sessions
use 'as any' to bypass strict typing, so the dead fields rode along
without complaint.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
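A typed mock factory avoids the `as any` trap described above: excess-property checking rejects fields that were removed from the interface, instead of letting them ride along silently. The `ActiveSession` shape here is a hypothetical slimmed-down stand-in, not the real one:

```typescript
// Hypothetical ActiveSession after the queue-engine cleanup.
interface ActiveSession {
  sessionId: string;
  project: string;
}

// A typed factory instead of `{...} as any`: if a test still passes
// earliestPendingTimestamp or processingMessageIds, the compiler flags
// the excess property rather than keeping the dead field alive.
function makeSession(overrides: Partial<ActiveSession> = {}): ActiveSession {
  return { sessionId: "s-1", project: "demo", ...overrides };
}
```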
* chore: drop 3 unused module-level constants
- src/npx-cli/banner.ts: CURSOR_HOME, CLEAR_DOWN (banner uses
CLEAR_SCREEN which combines clear-down + cursor-home into a single
CSI sequence; the standalone constants were leftovers).
- src/services/worker/BranchManager.ts: DEFAULT_SHELL_TIMEOUT_MS
(BranchManager only uses GIT_COMMAND_TIMEOUT_MS / NPM_INSTALL_TIMEOUT_MS).
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* chore(opencode-plugin): drop dead workerPost helper
Only the fire-and-forget variant (workerPostFireAndForget) is actually
called. workerPost was the await-result version with no remaining caller.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* chore: drop 8 truly-unused interface fields
Verified each by grepping for `.field`, `"field"`, `'field'`, and
`field:` patterns across src/ + tests/ + plugin/scripts. Where the
only remaining usage was the assignment site, removed the assignments too.
- GitHubStarsData: watchers_count, forks_count (only stargazers_count read)
- TableColumnInfo: dflt_value (PRAGMA returns it but no caller reads it)
- IndexInfo: seq (PRAGMA returns it but no caller reads it)
- ObservationRecord: source_files (legacy field, no readers)
- HookResult.hookSpecificOutput: permissionDecisionReason
- WatchTarget: rescanIntervalMs (set in config, never read)
- ShutdownResult: confirmedStopped (write-only — assigned but no
reader; updated all 3 return sites to drop it)
- ModePrompts: language_instruction (multilingual support never wired)
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* chore(npx-cli): reuse InstallOptions type instead of inline duplicate
parseInstallOptions had its return type written out inline as an
anonymous duplicate of InstallOptions. Use the canonical type
(import type — zero bundle cost).
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* chore(integrations): drop unused Platform type alias
The detectPlatform() function that returned this type was deleted earlier
in the branch (along with getScriptExtension that consumed it). The type
itself outlived its consumer; only string literals "Platform:" survive in
console.log diagnostics, which don't reference the alias.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* fix(worker): broadcast processing_status when summarize is queued
broadcastSummarizeQueued was an empty no-op even though
handleSummarizeByClaudeId calls it after enqueueing. The PendingMessageStore
onMutate callback already fires broadcastProcessingStatus on enqueue, but
calling it explicitly from broadcastSummarizeQueued ensures the spinner
ticks the moment a summary is requested even if the onMutate chain hits
a timing race.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* fix(worker): keep spinner on while summary generates
ClaudeProvider's SDK can pull multiple synthetic prompts (e.g.
observation + summarize) before producing responses. Each pull pushed
an ID to session.processingMessageIds. When the SDK's first
observation response came back, ResponseProcessor.confirmProcessed
deleted ALL pending message rows — including the still-in-flight
summary — so getTotalQueueDepth dropped to 0 and the spinner turned
off, even though the summary took another ~22s to actually generate.
Tag each in-flight message with its type ({id, type}) so the response
processor can pop only the FIFO message of the matching type
(observation vs summarize). The summary row stays in 'processing'
until its own response arrives, keeping the spinner lit through the
entire summary window.
Also updates Gemini/OpenRouter providers and GeneratorExitHandler for
the new shape.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
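The type-tagged pop described above might be sketched like this (hedged: names and shapes are illustrative, not lifted from ResponseProcessor):

```typescript
type MessageType = "observation" | "summarize";
interface InFlightMsg { id: number; type: MessageType }

// Type-aware FIFO consumption: pop the OLDEST in-flight message whose
// type matches the response, leaving e.g. a still-running summarize
// entry untouched so queue depth (and the spinner) stays up.
function popByType(queue: InFlightMsg[], type: MessageType): InFlightMsg | undefined {
  const i = queue.findIndex((m) => m.type === type);
  return i === -1 ? undefined : queue.splice(i, 1)[0];
}
```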
* fix(worker): clear summary from queue on any SDK response
Switch ResponseProcessor from type-aware FIFO matching to strict FIFO
popping (each SDK response → 1 in-flight message consumed). This way
the summary always clears when the SDK responds, even when the
response is unparseable or the summary doesn't actually generate
content — preventing stuck spinner / queue-depth-stuck-at-1.
Spinner behavior is preserved: messages enqueued after the summary
keep the queue depth elevated, and only when the SDK has responded
to every prompt does the queue drain to zero.
Also: when the consumed message is a 'summarize' and parsing fails,
treat it as best-effort and confirmProcessed (no retry) — summaries
that can't be parsed shouldn't keep retrying.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
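The strict-FIFO model above can be sketched as follows (an illustrative reduction, assuming one response consumes exactly one in-flight message; the real code operates on pending_messages rows):

```typescript
type MsgType = "observation" | "summarize";
interface Pending { id: number; type: MsgType }

// Strict FIFO: every SDK response consumes exactly one in-flight
// message, parseable or not. An unparseable summarize is best-effort
// (confirmed, never retried); an unparseable observation may retry.
function consumeResponse(
  queue: Pending[],
  parsed: boolean,
): { consumed?: Pending; retry: boolean } {
  const consumed = queue.shift(); // oldest in-flight message
  if (!consumed) return { retry: false };
  if (parsed) return { consumed, retry: false };
  return { consumed, retry: consumed.type === "observation" };
}
```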
* feat(viewer): redesign welcome card and remove source filters
The first-start welcome card now explains the three feed card types
(observation/summary/prompt) with color-coded badges, points users at
the gear icon for settings and the project dropdown for filtering, and
plugs /mem-search for recall — replacing the old two-line "ask:" prompts.
Source filter tabs (Claude/Codex/etc.) are removed from the header.
Filtering by AI provider made no sense from the user's point of view;
the project dropdown is the only header filter now. Source tracking is also
stripped from useSSE, usePagination, App state, and CSS.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* fix(viewer): keep welcome card in feed column, swap rows for 3 squares
Two visible problems in the previous design: the card stretched
edge-to-edge while feed cards sit in a centered 650px column, and
the body was a stack of long horizontal rows that scanned line-by-line.
Both fixed: Feed now accepts a pinnedTop slot so the welcome card
renders inside the same .feed-content column as observation cards.
Body is now a 3-column grid of square feature blocks — Live feed,
Tune it, Recall it — each with a custom inline SVG illustration
(stacked cards with color-coded stripes, gear+sliders, magnifier
over cards). Old text-row sections (welcome-card-types,
welcome-card-tips, welcome-card-section, welcome-card-tip-icon)
are removed. Squares stack to one column under 600px.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* feat(viewer): convert welcome card to glassy modal with stylized logo
Card now opens as a centered modal with a frosted/glass backdrop
(blur + saturate) so it doubles as a proper help dialog when reopened
from the header's question-mark button. Removed the observation count,
project count, and "since" date — those don't make sense for a
first-launch surface and felt out of place in a help context.
Header art swapped from the small webp logomark to the new
high-resolution sun/sunburst PNG (claude-mem-logo-stylized.png),
shipped as a checked-in asset in src/ui and plugin/ui.
Bigger throughout: 28px h2, 16px tagline, 88px illustrations,
26px feature padding, 1:1 aspect-ratio squares. Backdrop click and
Esc both close. Mobile collapses the grid to one column and drops
the aspect-ratio constraint.
Reverted the unused pinnedTop slot on Feed.tsx since the welcome
card is now a true overlay rather than an in-feed pinned card.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* fix(viewer): make welcome modal actually glassy
Previous version had a 55%-opacity black backdrop that almost fully
blocked the underlying UI — the "glass" was just a dark plate.
Now the backdrop is fully transparent (no darkening at all), the
panel itself drops to 55% bg-card opacity with its existing
backdrop-filter blur(28px) saturate(170%), and the feature squares
drop to 35% bg-tertiary so they layer as glass-on-glass over the
already-blurred panel. The header and feed below now read clearly
through the modal's frosted blur.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* fix(viewer): bulletproof square features via padding-bottom + clamp() fluid type
Squares were rendering taller than wide because aspect-ratio is treated
as a minimum — content can push the box past 1:1. Switched to the
classic padding-bottom: 100% trick: percentage padding resolves against
the parent's width, so the box is ALWAYS W × W regardless of content.
Inner content sits in an absolutely-positioned flex column that can't
push the shell taller.
Whole modal is now desktop-first and fluid via clamp() — no media-query
stair-steps for type, padding, gaps, border-radius, illustration size,
or modal width. Single mobile breakpoint at <600px collapses the grid
to one column and reverts the padding-bottom trick so each feature can
grow to natural content height.
Tightened the three feature descriptions so they fit comfortably inside
the square at the desktop size.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* style(viewer): 15% black overlay + heavier modal shadow for elevation
Backdrop goes from transparent to rgba(0,0,0,0.15) — just enough
darkening to push the modal visually forward without burying the
underlying UI. Modal shadow stacked: 40px/120px ambient + 16px/48px
contact, both deeper, plus the existing inset 1px highlight.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* fix(build): clear pending_messages queue on build-and-sync
Rewrites scripts/clear-failed-queue.ts to talk directly to SQLite via
bun:sqlite — the previous HTTP endpoints (/api/pending-queue/*) were
removed during the queue engine rewrite, so the script was orphaned.
Wires `npm run queue:clear` into `build-and-sync` so each rebuild
starts with a clean queue.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* refactor(worker): collapse parser to binary valid/invalid + clearPendingForSession model
- Parser: { valid: true, observations, summary } | { valid: false } — drops kind/skipped enum dispatch
- ResponseProcessor: two branches only (parseable → store + clearPendingForSession; else → no-op)
- Drop processingMessageIds + per-message claim/confirm/markFailed lifecycle across 3 providers
- PendingMessageStore: 226 → 140 lines; remove markFailed/transitionMessagesTo/confirmProcessed/clearFailedOlderThan/getAllPending (peekPendingTypes is kept)
- Schema migration v31+v32: drop retry_count, failed_at_epoch, completed_at_epoch, worker_pid columns
- SessionQueueProcessor: delete two 1s recovery sleeps (let iterator end on error)
- Server.ts/SettingsRoutes.ts: replace four magic-number setTimeout exit-flush patterns with flushResponseThen helper
- GeneratorExitHandler: 183 → 117 lines (drain in-flight loop gone)
Net: -181 lines. No more silent data loss via maxRetries=3.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
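The binary parser result named above is a two-shape discriminated union; a hedged sketch (field types are simplified placeholders, not the real parser output):

```typescript
// Collapsed parser result: exactly two shapes, replacing the old
// kind/skipped enum dispatch.
type ParseResult =
  | { valid: true; observations: string[]; summary: string | null }
  | { valid: false };

// ResponseProcessor's two branches: parseable -> store, else no-op.
function handle(result: ParseResult): "store" | "no-op" {
  return result.valid ? "store" : "no-op";
}
```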
* fix(pr-2255): address review comments batch 1
- install.ts: needsMarketplace true when claude-code selected (P1, was no-op)
- install.ts: throw on invalid --model so CLI exits non-zero
- install.ts: skip worker health checks + adapt next-step copy when --no-auto-start
- install.ts: repair regenerates plugin cache when missing
- index.ts: readFlag rejects missing/flag-shaped values
- index.ts: route flag-first invocations (e.g. `--provider claude`) to install
- banner.ts: fail-open if frame payload decode throws
- SearchRoutes.ts: 5s TTL cache for settings reads on hot hook path (P2)
- detect-error-handling-antipatterns.ts: trailing-brace strip whitespace-tolerant
- investigate-timestamps.ts: compute Dec 2025 epochs at runtime (was Dec 2024)
- regenerate-claude-md.ts: include workingDir in fallback walker so root is covered
- sync-marketplace.cjs: parseWorkerPort validates 1..65535 before http.request
- sync-to-marketplace.sh: resolve SOURCE_DIR from script location, not cwd
- Dockerfile.test-installer: bash --login sources .bashrc via .bash_profile
- docs/configuration.mdx: drop nonexistent .worker.port file refs, use settings.json
- docs/architecture-overview.md: dynamic port + queue model after parser collapse
- docs/architecture/worker-service.mdx: dynamic port example + drop port-file claim
- docs/platform-integration.mdx: WORKER_BASE_URL pattern, drop hardcoded 37777
- install/public/install.sh: Node 20 floor (was 18) to match docs
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
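The `parseWorkerPort` item in the list above amounts to a TCP-range guard; a hedged sketch (the function name comes from the commit, but the body, fallback handling, and signature are assumptions — the real script lives in sync-marketplace.cjs):

```typescript
// Accept only integers in the valid TCP port range before issuing the
// restart request; anything else falls back to a caller-supplied default.
function parseWorkerPort(raw: unknown, fallback: number): number {
  const n =
    typeof raw === "string" ? Number(raw)
    : typeof raw === "number" ? raw
    : NaN;
  if (!Number.isInteger(n) || n < 1 || n > 65535) return fallback;
  return n;
}
```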
* fix(pr-2255): reset claimed messages to pending on early-return paths
ResponseProcessor returns early in two cases:
- parser invalid (unparseable response)
- memorySessionId not yet captured
Both paths previously left the just-claimed message in `status='processing'`,
which counts toward `getPendingCount`. The generator-exit handler then sees
`pendingCount > 0` and respawns the generator, looping until the restart
guard trips and `clearPendingForSession` deletes the message — silent data
loss.
Calling `resetProcessingToPending` on these paths lets the next generator
pass re-claim the message and try again, instead of burning the restart
budget on no-op respawns.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
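In-memory, the reset described above is a status flip scoped to claimed rows; a hedged sketch (the real `resetProcessingToPending` runs as an UPDATE against pending_messages, not over an array):

```typescript
type Status = "pending" | "processing";
interface Row { id: number; status: Status }

// On the two early-return paths (unparseable response, memorySessionId
// not yet captured), flip just-claimed rows back to 'pending' so the
// next generator pass re-claims them instead of burning the restart
// budget on no-op respawns.
function resetProcessingToPending(rows: Row[]): number {
  let flipped = 0;
  for (const row of rows) {
    if (row.status === "processing") {
      row.status = "pending";
      flipped++;
    }
  }
  return flipped;
}
```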
* fix(pr-2255): swebench fallback row + troubleshooting port path
- evals/swebench/run-batch.py: append fallback prediction row when
orchestrator future raises, preserving "never drop an instance" guarantee
- docs/troubleshooting.mdx: drop nonexistent .worker.port / worker.port file
references; use settings.json + /api/health for port discovery
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* fix(pr-2255): memoize per-project observation count for welcome-hint hot path
handleContextInject runs on every PostToolUse hook (after every Read/Edit).
The welcome-hint block ran a COUNT(*) on observations for every call once
CLAUDE_MEM_WELCOME_HINT_ENABLED was true. Observation counts are
monotonically increasing — once a project has any observations it always
will — so cache the positive result in a Set and skip the COUNT(*) on
subsequent requests.
Combined with the 5s settings TTL added earlier, the steady-state cost on
the hook hot path drops to a Set lookup.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* fix(pr-2255): use clearProcessingForSession on AI-success path
clearPendingForSession deletes ALL rows for the session. On the success
path of processAgentResponse, that's wrong: messages that arrived as
'pending' during the (1-5s) AI response latency get deleted along with
the 'processing' row we just consumed. In a hook burst (three quick
PostToolUse hooks), B and C land while A is in flight; A's success then
nukes B and C — silent data loss.
Add a status-scoped clearProcessingForSession to PendingMessageStore +
SessionManager, and use it in ResponseProcessor's success path. The
unconditional clearPendingForSession remains correct in
GeneratorExitHandler for hard-stop / restart-guard-trip paths.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* Revert "fix(pr-2255): use clearProcessingForSession on AI-success path"
This reverts commit a08995299c30cbad36bddc3e5bddda7af8604b35.
---------
Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
786 lines
88 KiB
JavaScript
"use strict";var Mt=Object.create;var H=Object.defineProperty;var Dt=Object.getOwnPropertyDescriptor;var vt=Object.getOwnPropertyNames;var yt=Object.getPrototypeOf,Ut=Object.prototype.hasOwnProperty;var xt=(r,e)=>{for(var t in e)H(r,t,{get:e[t],enumerable:!0})},be=(r,e,t,s)=>{if(e&&typeof e=="object"||typeof e=="function")for(let n of vt(e))!Ut.call(r,n)&&n!==t&&H(r,n,{get:()=>e[n],enumerable:!(s=Dt(e,n))||s.enumerable});return r};var M=(r,e,t)=>(t=r!=null?Mt(yt(r)):{},be(e||!r||!r.__esModule?H(t,"default",{value:r,enumerable:!0}):t,r)),wt=r=>be(H({},"__esModule",{value:!0}),r);var os={};xt(os,{generateContext:()=>fe});module.exports=wt(os);var Ct=M(require("path"),1),It=require("os"),Lt=require("fs");var oe=require("bun:sqlite");var h=require("path"),te=require("os"),G=require("fs");var Re=require("url");var L=require("fs"),U=require("path"),Oe=require("os"),Z=(o=>(o[o.DEBUG=0]="DEBUG",o[o.INFO=1]="INFO",o[o.WARN=2]="WARN",o[o.ERROR=3]="ERROR",o[o.SILENT=4]="SILENT",o))(Z||{}),he=(0,U.join)((0,Oe.homedir)(),".claude-mem"),ee=class{level=null;useColor;logFilePath=null;logFileInitialized=!1;constructor(){this.useColor=process.stdout.isTTY??!1}ensureLogFileInitialized(){if(!this.logFileInitialized){this.logFileInitialized=!0;try{let e=(0,U.join)(he,"logs");(0,L.existsSync)(e)||(0,L.mkdirSync)(e,{recursive:!0});let t=new Date().toISOString().split("T")[0];this.logFilePath=(0,U.join)(e,`claude-mem-${t}.log`)}catch(e){console.error("[LOGGER] Failed to initialize log file:",e instanceof Error?e.message:String(e)),this.logFilePath=null}}}getLevel(){if(this.level===null)try{let e=(0,U.join)(he,"settings.json");if((0,L.existsSync)(e)){let t=(0,L.readFileSync)(e,"utf-8"),n=(JSON.parse(t).CLAUDE_MEM_LOG_LEVEL||"INFO").toUpperCase();this.level=Z[n]??1}else this.level=1}catch(e){console.error("[LOGGER] Failed to load log level from settings:",e instanceof Error?e.message:String(e)),this.level=1}return 
this.level}correlationId(e,t){return`obs-${e}-${t}`}sessionId(e){return`session-${e}`}formatData(e){if(e==null)return"";if(typeof e=="string")return e;if(typeof e=="number"||typeof e=="boolean")return e.toString();if(typeof e=="object"){if(e instanceof Error)return this.getLevel()===0?`${e.message}
${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let t=Object.keys(e);return t.length===0?"{}":t.length<=3?JSON.stringify(e):`{${t.length} keys: ${t.slice(0,3).join(", ")}...}`}return String(e)}formatTool(e,t){if(!t)return e;let s=t;if(typeof t=="string")try{s=JSON.parse(t)}catch{s=t}if(e==="Bash"&&s.command)return`${e}(${s.command})`;if(s.file_path)return`${e}(${s.file_path})`;if(s.notebook_path)return`${e}(${s.notebook_path})`;if(e==="Glob"&&s.pattern)return`${e}(${s.pattern})`;if(e==="Grep"&&s.pattern)return`${e}(${s.pattern})`;if(s.url)return`${e}(${s.url})`;if(s.query)return`${e}(${s.query})`;if(e==="Task"){if(s.subagent_type)return`${e}(${s.subagent_type})`;if(s.description)return`${e}(${s.description})`}return e==="Skill"&&s.skill?`${e}(${s.skill})`:e==="LSP"&&s.operation?`${e}(${s.operation})`:e}formatTimestamp(e){let t=e.getFullYear(),s=String(e.getMonth()+1).padStart(2,"0"),n=String(e.getDate()).padStart(2,"0"),o=String(e.getHours()).padStart(2,"0"),i=String(e.getMinutes()).padStart(2,"0"),a=String(e.getSeconds()).padStart(2,"0"),d=String(e.getMilliseconds()).padStart(3,"0");return`${t}-${s}-${n} ${o}:${i}:${a}.${d}`}log(e,t,s,n,o){if(e<this.getLevel())return;this.ensureLogFileInitialized();let i=this.formatTimestamp(new Date),a=Z[e].padEnd(5),d=t.padEnd(6),c="";n?.correlationId?c=`[${n.correlationId}] `:n?.sessionId&&(c=`[session-${n.sessionId}] `);let m="";if(o!=null)if(o instanceof Error)m=this.getLevel()===0?`
${o.message}
${o.stack}`:` ${o.message}`;else if(this.getLevel()===0&&typeof o=="object")try{m=`
`+JSON.stringify(o,null,2)}catch{m=" "+this.formatData(o)}else m=" "+this.formatData(o);let T="";if(n){let{sessionId:g,memorySessionId:f,correlationId:S,...p}=n;Object.keys(p).length>0&&(T=` {${Object.entries(p).map(([b,l])=>`${b}=${l}`).join(", ")}}`)}let E=`[${i}] [${a}] [${d}] ${c}${s}${T}${m}`;if(this.logFilePath)try{(0,L.appendFileSync)(this.logFilePath,E+`
`,"utf8")}catch(g){process.stderr.write(`[LOGGER] Failed to write to log file: ${g instanceof Error?g.message:String(g)}
`)}else process.stderr.write(E+`
`)}debug(e,t,s,n){this.log(0,e,t,s,n)}info(e,t,s,n){this.log(1,e,t,s,n)}warn(e,t,s,n){this.log(2,e,t,s,n)}error(e,t,s,n){this.log(3,e,t,s,n)}dataIn(e,t,s,n){this.info(e,`\u2192 ${t}`,s,n)}dataOut(e,t,s,n){this.info(e,`\u2190 ${t}`,s,n)}success(e,t,s,n){this.info(e,`\u2713 ${t}`,s,n)}failure(e,t,s,n){this.error(e,`\u2717 ${t}`,s,n)}timing(e,t,s,n){this.info(e,`\u23F1 ${t}`,n,{duration:`${s}ms`})}happyPathError(e,t,s,n,o=""){let c=((new Error().stack||"").split(`
`)[2]||"").match(/at\s+(?:.*\s+)?\(?([^:]+):(\d+):(\d+)\)?/),m=c?`${c[1].split("/").pop()}:${c[2]}`:"unknown",T={...s,location:m};return this.warn(e,`[HAPPY-PATH] ${t}`,T,n),o}},u=new ee;var Ht={};function kt(){return typeof __dirname<"u"?__dirname:(0,h.dirname)((0,Re.fileURLToPath)(Ht.url))}var Ft=kt();function $t(){if(process.env.CLAUDE_MEM_DATA_DIR)return process.env.CLAUDE_MEM_DATA_DIR;let r=(0,h.join)((0,te.homedir)(),".claude-mem"),e=(0,h.join)(r,"settings.json");try{if((0,G.existsSync)(e)){let{readFileSync:t}=require("fs"),s=JSON.parse(t(e,"utf-8")),n=s.env??s;if(n.CLAUDE_MEM_DATA_DIR)return n.CLAUDE_MEM_DATA_DIR}}catch{}return r}var N=$t(),D=process.env.CLAUDE_CONFIG_DIR||(0,h.join)((0,te.homedir)(),".claude"),us=(0,h.join)(D,"plugins","marketplaces","thedotmack"),ms=(0,h.join)(N,"archives"),cs=(0,h.join)(N,"logs"),ls=(0,h.join)(N,"trash"),ps=(0,h.join)(N,"backups"),Es=(0,h.join)(N,"modes"),gs=(0,h.join)(N,"settings.json"),Ae=(0,h.join)(N,"claude-mem.db"),Ts=(0,h.join)(N,"vector-db"),Pt=(0,h.join)(N,"observer-sessions"),se=(0,h.basename)(Pt),fs=(0,h.join)(D,"settings.json"),Ss=(0,h.join)(D,"commands"),bs=(0,h.join)(D,"CLAUDE.md");function Ne(r){(0,G.mkdirSync)(r,{recursive:!0})}function Ce(){return(0,h.join)(Ft,"..")}var ve=require("crypto");var Le=require("os"),Me=M(require("path"),1);var j=require("fs"),X=M(require("path"),1),x={isWorktree:!1,worktreeName:null,parentRepoPath:null,parentProjectName:null};function Ie(r){let e=X.default.join(r,".git"),t;try{t=(0,j.statSync)(e)}catch(m){return m instanceof Error&&m.code!=="ENOENT"&&console.warn("[worktree] Unexpected error checking .git:",m),x}if(!t.isFile())return x;let s;try{s=(0,j.readFileSync)(e,"utf-8").trim()}catch(m){return console.warn("[worktree] Failed to read .git file:",m instanceof Error?m.message:String(m)),x}let n=s.match(/^gitdir:\s*(.+)$/);if(!n)return x;let i=n[1].match(/^(.+)[/\\]\.git[/\\]worktrees[/\\]([^/\\]+)$/);if(!i)return x;let 
a=i[1],d=X.default.basename(r),c=X.default.basename(a);return{isWorktree:!0,worktreeName:d,parentRepoPath:a,parentProjectName:c}}function De(r){return r==="~"||r.startsWith("~/")?r.replace(/^~/,(0,Le.homedir)()):r}function Gt(r){if(!r||r.trim()==="")return u.warn("PROJECT_NAME","Empty cwd provided, using fallback",{cwd:r}),"unknown-project";let e=De(r),t=Me.default.basename(e);if(t===""){if(process.platform==="win32"){let n=r.match(/^([A-Z]):\\/i);if(n){let i=`drive-${n[1].toUpperCase()}`;return u.info("PROJECT_NAME","Drive root detected",{cwd:r,projectName:i}),i}}return u.warn("PROJECT_NAME","Root directory detected, using fallback",{cwd:r}),"unknown-project"}return t}function re(r){let e=Gt(r);if(!r)return{primary:e,parent:null,isWorktree:!1,allProjects:[e]};let t=De(r),s=Ie(t);if(s.isWorktree&&s.parentProjectName){let n=`${s.parentProjectName}/${e}`;return{primary:n,parent:s.parentProjectName,isWorktree:!0,allProjects:[s.parentProjectName,n]}}return{primary:e,parent:null,isWorktree:!1,allProjects:[e]}}function B(r,e,t){return(0,ve.createHash)("sha256").update([r||"",e||"",t||""].join("\0")).digest("hex").slice(0,16)}function ne(r){if(!r)return[];try{let e=JSON.parse(r);return Array.isArray(e)?e:[String(e)]}catch{return[r]}}var R="claude";function Xt(r){return r.trim().toLowerCase().replace(/\s+/g,"-")}function v(r){if(!r)return R;let e=Xt(r);return e?e==="transcript"||e.includes("codex")?"codex":e.includes("cursor")?"cursor":e.includes("claude")?"claude":e:R}function ye(r){let e=["claude","codex","cursor"];return[...r].sort((t,s)=>{let n=e.indexOf(t),o=e.indexOf(s);return n!==-1||o!==-1?n===-1?1:o===-1?-1:n-o:t.localeCompare(s)})}function jt(r,e){return{customTitle:r,platformSource:e?v(e):void 0}}var W=class{db;constructor(e=Ae){e instanceof oe.Database?this.db=e:(e!==":memory:"&&Ne(N),this.db=new oe.Database(e),this.db.run("PRAGMA journal_mode = WAL"),this.db.run("PRAGMA synchronous = NORMAL"),this.db.run("PRAGMA foreign_keys = ON"),this.db.run("PRAGMA 
journal_size_limit = 4194304")),this.initializeSchema(),this.ensureWorkerPortColumn(),this.ensurePromptTrackingColumns(),this.removeSessionSummariesUniqueConstraint(),this.addObservationHierarchicalFields(),this.makeObservationsTextNullable(),this.createUserPromptsTable(),this.ensureDiscoveryTokensColumn(),this.createPendingMessagesTable(),this.renameSessionIdColumns(),this.repairSessionIdColumnRename(),this.addFailedAtEpochColumn(),this.addOnUpdateCascadeToForeignKeys(),this.addObservationContentHashColumn(),this.addSessionCustomTitleColumn(),this.addSessionPlatformSourceColumn(),this.addObservationModelColumns(),this.ensureMergedIntoProjectColumns(),this.addObservationSubagentColumns(),this.addPendingMessagesToolUseIdAndWorkerPidColumns(),this.addObservationsUniqueContentHashIndex(),this.addObservationsMetadataColumn(),this.dropDeadPendingMessagesColumns(),this.dropWorkerPidColumn()}dropWorkerPidColumn(){if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(32))return;if(this.db.query("PRAGMA table_info(pending_messages)").all().some(n=>n.name==="worker_pid"))try{this.db.run("DROP INDEX IF EXISTS idx_pending_messages_worker_pid"),this.db.run("ALTER TABLE pending_messages DROP COLUMN worker_pid"),u.debug("DB","Dropped worker_pid column and its index from pending_messages")}catch(n){u.warn("DB","Failed to drop worker_pid column from pending_messages",{},n instanceof Error?n:new Error(String(n)))}this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(32,new Date().toISOString())}dropDeadPendingMessagesColumns(){if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(31))return;let t=this.db.query("PRAGMA table_info(pending_messages)").all(),s=new Set(t.map(i=>i.name)),o=["retry_count","failed_at_epoch","completed_at_epoch","worker_pid"].filter(i=>s.has(i));if(o.length>0){this.db.run("DELETE FROM pending_messages WHERE status NOT IN ('pending', 'processing')");for(let i of 
o)try{this.db.run(`ALTER TABLE pending_messages DROP COLUMN ${i}`),u.debug("DB",`Dropped dead column ${i} from pending_messages`)}catch(a){u.warn("DB",`Failed to drop column ${i} from pending_messages`,{},a instanceof Error?a:new Error(String(a)))}}this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(31,new Date().toISOString())}initializeSchema(){this.db.run(`
      CREATE TABLE IF NOT EXISTS schema_versions (
        id INTEGER PRIMARY KEY,
        version INTEGER UNIQUE NOT NULL,
        applied_at TEXT NOT NULL
      )
    `),this.db.run(`
      CREATE TABLE IF NOT EXISTS sdk_sessions (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        content_session_id TEXT UNIQUE NOT NULL,
        memory_session_id TEXT UNIQUE,
        project TEXT NOT NULL,
        platform_source TEXT NOT NULL DEFAULT 'claude',
        user_prompt TEXT,
        started_at TEXT NOT NULL,
        started_at_epoch INTEGER NOT NULL,
        completed_at TEXT,
        completed_at_epoch INTEGER,
        status TEXT CHECK(status IN ('active', 'completed', 'failed')) NOT NULL DEFAULT 'active'
      );

      CREATE INDEX IF NOT EXISTS idx_sdk_sessions_claude_id ON sdk_sessions(content_session_id);
      CREATE INDEX IF NOT EXISTS idx_sdk_sessions_sdk_id ON sdk_sessions(memory_session_id);
      CREATE INDEX IF NOT EXISTS idx_sdk_sessions_project ON sdk_sessions(project);
      CREATE INDEX IF NOT EXISTS idx_sdk_sessions_status ON sdk_sessions(status);
      CREATE INDEX IF NOT EXISTS idx_sdk_sessions_started ON sdk_sessions(started_at_epoch DESC);

      CREATE TABLE IF NOT EXISTS observations (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        memory_session_id TEXT NOT NULL,
        project TEXT NOT NULL,
        text TEXT NOT NULL,
        type TEXT NOT NULL,
        created_at TEXT NOT NULL,
        created_at_epoch INTEGER NOT NULL,
        FOREIGN KEY(memory_session_id) REFERENCES sdk_sessions(memory_session_id) ON DELETE CASCADE ON UPDATE CASCADE
      );

      CREATE INDEX IF NOT EXISTS idx_observations_sdk_session ON observations(memory_session_id);
      CREATE INDEX IF NOT EXISTS idx_observations_project ON observations(project);
      CREATE INDEX IF NOT EXISTS idx_observations_type ON observations(type);
      CREATE INDEX IF NOT EXISTS idx_observations_created ON observations(created_at_epoch DESC);

      CREATE TABLE IF NOT EXISTS session_summaries (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        memory_session_id TEXT UNIQUE NOT NULL,
        project TEXT NOT NULL,
        request TEXT,
        investigated TEXT,
        learned TEXT,
        completed TEXT,
        next_steps TEXT,
        files_read TEXT,
        files_edited TEXT,
        notes TEXT,
        created_at TEXT NOT NULL,
        created_at_epoch INTEGER NOT NULL,
        FOREIGN KEY(memory_session_id) REFERENCES sdk_sessions(memory_session_id) ON DELETE CASCADE ON UPDATE CASCADE
      );

      CREATE INDEX IF NOT EXISTS idx_session_summaries_sdk_session ON session_summaries(memory_session_id);
      CREATE INDEX IF NOT EXISTS idx_session_summaries_project ON session_summaries(project);
      CREATE INDEX IF NOT EXISTS idx_session_summaries_created ON session_summaries(created_at_epoch DESC);
    `),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(4,new Date().toISOString())}ensureWorkerPortColumn(){this.db.query("PRAGMA table_info(sdk_sessions)").all().some(s=>s.name==="worker_port")||(this.db.run("ALTER TABLE sdk_sessions ADD COLUMN worker_port INTEGER"),u.debug("DB","Added worker_port column to sdk_sessions table")),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(5,new Date().toISOString())}ensurePromptTrackingColumns(){this.db.query("PRAGMA table_info(sdk_sessions)").all().some(a=>a.name==="prompt_counter")||(this.db.run("ALTER TABLE sdk_sessions ADD COLUMN prompt_counter INTEGER DEFAULT 0"),u.debug("DB","Added prompt_counter column to sdk_sessions table")),this.db.query("PRAGMA table_info(observations)").all().some(a=>a.name==="prompt_number")||(this.db.run("ALTER TABLE observations ADD COLUMN prompt_number INTEGER"),u.debug("DB","Added prompt_number column to observations table")),this.db.query("PRAGMA table_info(session_summaries)").all().some(a=>a.name==="prompt_number")||(this.db.run("ALTER TABLE session_summaries ADD COLUMN prompt_number INTEGER"),u.debug("DB","Added prompt_number column to session_summaries table")),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(6,new Date().toISOString())}removeSessionSummariesUniqueConstraint(){if(!this.db.query("PRAGMA index_list(session_summaries)").all().some(s=>s.unique===1&&s.origin!=="pk")){this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(7,new Date().toISOString());return}u.debug("DB","Removing UNIQUE constraint from session_summaries.memory_session_id"),this.db.run("BEGIN TRANSACTION"),this.db.run("DROP TABLE IF EXISTS session_summaries_new"),this.db.run(`
      CREATE TABLE session_summaries_new (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        memory_session_id TEXT NOT NULL,
        project TEXT NOT NULL,
        request TEXT,
        investigated TEXT,
        learned TEXT,
        completed TEXT,
        next_steps TEXT,
        files_read TEXT,
        files_edited TEXT,
        notes TEXT,
        prompt_number INTEGER,
        created_at TEXT NOT NULL,
        created_at_epoch INTEGER NOT NULL,
        FOREIGN KEY(memory_session_id) REFERENCES sdk_sessions(memory_session_id) ON DELETE CASCADE
      )
    `),this.db.run(`
      INSERT INTO session_summaries_new
      SELECT id, memory_session_id, project, request, investigated, learned,
             completed, next_steps, files_read, files_edited, notes,
             prompt_number, created_at, created_at_epoch
      FROM session_summaries
    `),this.db.run("DROP TABLE session_summaries"),this.db.run("ALTER TABLE session_summaries_new RENAME TO session_summaries"),this.db.run(`
      CREATE INDEX idx_session_summaries_sdk_session ON session_summaries(memory_session_id);
      CREATE INDEX idx_session_summaries_project ON session_summaries(project);
      CREATE INDEX idx_session_summaries_created ON session_summaries(created_at_epoch DESC);
    `),this.db.run("COMMIT"),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(7,new Date().toISOString()),u.debug("DB","Successfully removed UNIQUE constraint from session_summaries.memory_session_id")}addObservationHierarchicalFields(){if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(8))return;if(this.db.query("PRAGMA table_info(observations)").all().some(n=>n.name==="title")){this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(8,new Date().toISOString());return}u.debug("DB","Adding hierarchical fields to observations table"),this.db.run(`
      ALTER TABLE observations ADD COLUMN title TEXT;
      ALTER TABLE observations ADD COLUMN subtitle TEXT;
      ALTER TABLE observations ADD COLUMN facts TEXT;
      ALTER TABLE observations ADD COLUMN narrative TEXT;
      ALTER TABLE observations ADD COLUMN concepts TEXT;
      ALTER TABLE observations ADD COLUMN files_read TEXT;
      ALTER TABLE observations ADD COLUMN files_modified TEXT;
    `),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(8,new Date().toISOString()),u.debug("DB","Successfully added hierarchical fields to observations table")}makeObservationsTextNullable(){if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(9))return;let s=this.db.query("PRAGMA table_info(observations)").all().find(n=>n.name==="text");if(!s||s.notnull===0){this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(9,new Date().toISOString());return}u.debug("DB","Making observations.text nullable"),this.db.run("BEGIN TRANSACTION"),this.db.run("DROP TABLE IF EXISTS observations_new"),this.db.run(`
      CREATE TABLE observations_new (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        memory_session_id TEXT NOT NULL,
        project TEXT NOT NULL,
        text TEXT,
        type TEXT NOT NULL,
        title TEXT,
        subtitle TEXT,
        facts TEXT,
        narrative TEXT,
        concepts TEXT,
        files_read TEXT,
        files_modified TEXT,
        prompt_number INTEGER,
        created_at TEXT NOT NULL,
        created_at_epoch INTEGER NOT NULL,
        FOREIGN KEY(memory_session_id) REFERENCES sdk_sessions(memory_session_id) ON DELETE CASCADE
      )
    `),this.db.run(`
      INSERT INTO observations_new
      SELECT id, memory_session_id, project, text, type, title, subtitle, facts,
             narrative, concepts, files_read, files_modified, prompt_number,
             created_at, created_at_epoch
      FROM observations
    `),this.db.run("DROP TABLE observations"),this.db.run("ALTER TABLE observations_new RENAME TO observations"),this.db.run(`
      CREATE INDEX idx_observations_sdk_session ON observations(memory_session_id);
      CREATE INDEX idx_observations_project ON observations(project);
      CREATE INDEX idx_observations_type ON observations(type);
      CREATE INDEX idx_observations_created ON observations(created_at_epoch DESC);
    `),this.db.run("COMMIT"),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(9,new Date().toISOString()),u.debug("DB","Successfully made observations.text nullable")}createUserPromptsTable(){if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(10))return;if(this.db.query("PRAGMA table_info(user_prompts)").all().length>0){this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(10,new Date().toISOString());return}u.debug("DB","Creating user_prompts table with FTS5 support"),this.db.run("BEGIN TRANSACTION"),this.db.run(`
      CREATE TABLE user_prompts (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        content_session_id TEXT NOT NULL,
        prompt_number INTEGER NOT NULL,
        prompt_text TEXT NOT NULL,
        created_at TEXT NOT NULL,
        created_at_epoch INTEGER NOT NULL,
        FOREIGN KEY(content_session_id) REFERENCES sdk_sessions(content_session_id) ON DELETE CASCADE
      );

      CREATE INDEX idx_user_prompts_claude_session ON user_prompts(content_session_id);
      CREATE INDEX idx_user_prompts_created ON user_prompts(created_at_epoch DESC);
      CREATE INDEX idx_user_prompts_prompt_number ON user_prompts(prompt_number);
      CREATE INDEX idx_user_prompts_lookup ON user_prompts(content_session_id, prompt_number);
    `);let s=`
      CREATE VIRTUAL TABLE user_prompts_fts USING fts5(
        prompt_text,
        content='user_prompts',
        content_rowid='id'
      );
    `,n=`
      CREATE TRIGGER user_prompts_ai AFTER INSERT ON user_prompts BEGIN
        INSERT INTO user_prompts_fts(rowid, prompt_text)
        VALUES (new.id, new.prompt_text);
      END;

      CREATE TRIGGER user_prompts_ad AFTER DELETE ON user_prompts BEGIN
        INSERT INTO user_prompts_fts(user_prompts_fts, rowid, prompt_text)
        VALUES('delete', old.id, old.prompt_text);
      END;

      CREATE TRIGGER user_prompts_au AFTER UPDATE ON user_prompts BEGIN
        INSERT INTO user_prompts_fts(user_prompts_fts, rowid, prompt_text)
        VALUES('delete', old.id, old.prompt_text);
        INSERT INTO user_prompts_fts(rowid, prompt_text)
        VALUES (new.id, new.prompt_text);
      END;
    `;try{this.db.run(s),this.db.run(n)}catch(o){o instanceof Error?u.warn("DB","FTS5 not available \u2014 user_prompts_fts skipped (search uses ChromaDB)",{},o):u.warn("DB","FTS5 not available \u2014 user_prompts_fts skipped (search uses ChromaDB)",{},new Error(String(o))),this.db.run("COMMIT"),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(10,new Date().toISOString()),u.debug("DB","Created user_prompts table (without FTS5)");return}this.db.run("COMMIT"),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(10,new Date().toISOString()),u.debug("DB","Successfully created user_prompts table")}ensureDiscoveryTokensColumn(){if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(11))return;this.db.query("PRAGMA table_info(observations)").all().some(i=>i.name==="discovery_tokens")||(this.db.run("ALTER TABLE observations ADD COLUMN discovery_tokens INTEGER DEFAULT 0"),u.debug("DB","Added discovery_tokens column to observations table")),this.db.query("PRAGMA table_info(session_summaries)").all().some(i=>i.name==="discovery_tokens")||(this.db.run("ALTER TABLE session_summaries ADD COLUMN discovery_tokens INTEGER DEFAULT 0"),u.debug("DB","Added discovery_tokens column to session_summaries table")),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(11,new Date().toISOString())}createPendingMessagesTable(){if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(16))return;if(this.db.query("SELECT name FROM sqlite_master WHERE type='table' AND name='pending_messages'").all().length>0){this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(16,new Date().toISOString());return}u.debug("DB","Creating pending_messages table"),this.db.run(`
      CREATE TABLE pending_messages (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        session_db_id INTEGER NOT NULL,
        content_session_id TEXT NOT NULL,
        message_type TEXT NOT NULL CHECK(message_type IN ('observation', 'summarize')),
        tool_name TEXT,
        tool_input TEXT,
        tool_response TEXT,
        cwd TEXT,
        last_user_message TEXT,
        last_assistant_message TEXT,
        prompt_number INTEGER,
        status TEXT NOT NULL DEFAULT 'pending' CHECK(status IN ('pending', 'processing')),
        created_at_epoch INTEGER NOT NULL,
        FOREIGN KEY (session_db_id) REFERENCES sdk_sessions(id) ON DELETE CASCADE
      )
    `),this.db.run("CREATE INDEX IF NOT EXISTS idx_pending_messages_session ON pending_messages(session_db_id)"),this.db.run("CREATE INDEX IF NOT EXISTS idx_pending_messages_status ON pending_messages(status)"),this.db.run("CREATE INDEX IF NOT EXISTS idx_pending_messages_claude_session ON pending_messages(content_session_id)"),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(16,new Date().toISOString()),u.debug("DB","pending_messages table created successfully")}renameSessionIdColumns(){if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(17))return;u.debug("DB","Checking session ID columns for semantic clarity rename");let t=0,s=(n,o,i)=>{let a=this.db.query(`PRAGMA table_info(${n})`).all(),d=a.some(m=>m.name===o);return a.some(m=>m.name===i)?!1:d?(this.db.run(`ALTER TABLE ${n} RENAME COLUMN ${o} TO ${i}`),u.debug("DB",`Renamed ${n}.${o} to ${i}`),!0):(u.warn("DB",`Column ${o} not found in ${n}, skipping rename`),!1)};s("sdk_sessions","claude_session_id","content_session_id")&&t++,s("sdk_sessions","sdk_session_id","memory_session_id")&&t++,s("pending_messages","claude_session_id","content_session_id")&&t++,s("observations","sdk_session_id","memory_session_id")&&t++,s("session_summaries","sdk_session_id","memory_session_id")&&t++,s("user_prompts","claude_session_id","content_session_id")&&t++,this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(17,new Date().toISOString()),t>0?u.debug("DB",`Successfully renamed ${t} session ID columns`):u.debug("DB","No session ID column renames needed (already up to date)")}repairSessionIdColumnRename(){this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(19)||this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(19,new Date().toISOString())}addFailedAtEpochColumn(){if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(20))return;this.db.query("PRAGMA table_info(pending_messages)").all().some(n=>n.name==="failed_at_epoch")||(this.db.run("ALTER TABLE pending_messages ADD COLUMN failed_at_epoch INTEGER"),u.debug("DB","Added failed_at_epoch column to pending_messages table")),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(20,new Date().toISOString())}addOnUpdateCascadeToForeignKeys(){if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(21))return;u.debug("DB","Adding ON UPDATE CASCADE to FK constraints on observations and session_summaries"),this.db.run("PRAGMA foreign_keys = OFF"),this.db.run("BEGIN TRANSACTION"),this.db.run("DROP TRIGGER IF EXISTS observations_ai"),this.db.run("DROP TRIGGER IF EXISTS observations_ad"),this.db.run("DROP TRIGGER IF EXISTS observations_au"),this.db.run("DROP TABLE IF EXISTS observations_new");let s=this.db.query("PRAGMA table_info(observations)").all().some(f=>f.name==="metadata"),n=s?`,
      metadata TEXT`:"",o=s?", metadata":"",i=`
      CREATE TABLE observations_new (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        memory_session_id TEXT NOT NULL,
        project TEXT NOT NULL,
        text TEXT,
        type TEXT NOT NULL,
        title TEXT,
        subtitle TEXT,
        facts TEXT,
        narrative TEXT,
        concepts TEXT,
        files_read TEXT,
        files_modified TEXT,
        prompt_number INTEGER,
        discovery_tokens INTEGER DEFAULT 0,
        created_at TEXT NOT NULL,
        created_at_epoch INTEGER NOT NULL${n},
        FOREIGN KEY(memory_session_id) REFERENCES sdk_sessions(memory_session_id) ON DELETE CASCADE ON UPDATE CASCADE
      )
    `,a=`
      INSERT INTO observations_new
      SELECT id, memory_session_id, project, text, type, title, subtitle, facts,
             narrative, concepts, files_read, files_modified, prompt_number,
             discovery_tokens, created_at, created_at_epoch${o}
      FROM observations
    `,d=`
      CREATE INDEX idx_observations_sdk_session ON observations(memory_session_id);
      CREATE INDEX idx_observations_project ON observations(project);
      CREATE INDEX idx_observations_type ON observations(type);
      CREATE INDEX idx_observations_created ON observations(created_at_epoch DESC);
    `,c=`
      CREATE TRIGGER IF NOT EXISTS observations_ai AFTER INSERT ON observations BEGIN
        INSERT INTO observations_fts(rowid, title, subtitle, narrative, text, facts, concepts)
        VALUES (new.id, new.title, new.subtitle, new.narrative, new.text, new.facts, new.concepts);
      END;

      CREATE TRIGGER IF NOT EXISTS observations_ad AFTER DELETE ON observations BEGIN
        INSERT INTO observations_fts(observations_fts, rowid, title, subtitle, narrative, text, facts, concepts)
        VALUES('delete', old.id, old.title, old.subtitle, old.narrative, old.text, old.facts, old.concepts);
      END;

      CREATE TRIGGER IF NOT EXISTS observations_au AFTER UPDATE ON observations BEGIN
        INSERT INTO observations_fts(observations_fts, rowid, title, subtitle, narrative, text, facts, concepts)
        VALUES('delete', old.id, old.title, old.subtitle, old.narrative, old.text, old.facts, old.concepts);
        INSERT INTO observations_fts(rowid, title, subtitle, narrative, text, facts, concepts)
        VALUES (new.id, new.title, new.subtitle, new.narrative, new.text, new.facts, new.concepts);
      END;
    `;this.db.run("DROP TRIGGER IF EXISTS session_summaries_ai"),this.db.run("DROP TRIGGER IF EXISTS session_summaries_ad"),this.db.run("DROP TRIGGER IF EXISTS session_summaries_au"),this.db.run("DROP TABLE IF EXISTS session_summaries_new");let m=`
      CREATE TABLE session_summaries_new (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        memory_session_id TEXT NOT NULL,
        project TEXT NOT NULL,
        request TEXT,
        investigated TEXT,
        learned TEXT,
        completed TEXT,
        next_steps TEXT,
        files_read TEXT,
        files_edited TEXT,
        notes TEXT,
        prompt_number INTEGER,
        discovery_tokens INTEGER DEFAULT 0,
        created_at TEXT NOT NULL,
        created_at_epoch INTEGER NOT NULL,
        FOREIGN KEY(memory_session_id) REFERENCES sdk_sessions(memory_session_id) ON DELETE CASCADE ON UPDATE CASCADE
      )
    `,T=`
      INSERT INTO session_summaries_new
      SELECT id, memory_session_id, project, request, investigated, learned,
             completed, next_steps, files_read, files_edited, notes,
             prompt_number, discovery_tokens, created_at, created_at_epoch
      FROM session_summaries
    `,E=`
      CREATE INDEX idx_session_summaries_sdk_session ON session_summaries(memory_session_id);
      CREATE INDEX idx_session_summaries_project ON session_summaries(project);
      CREATE INDEX idx_session_summaries_created ON session_summaries(created_at_epoch DESC);
    `,g=`
      CREATE TRIGGER IF NOT EXISTS session_summaries_ai AFTER INSERT ON session_summaries BEGIN
        INSERT INTO session_summaries_fts(rowid, request, investigated, learned, completed, next_steps, notes)
        VALUES (new.id, new.request, new.investigated, new.learned, new.completed, new.next_steps, new.notes);
      END;

      CREATE TRIGGER IF NOT EXISTS session_summaries_ad AFTER DELETE ON session_summaries BEGIN
        INSERT INTO session_summaries_fts(session_summaries_fts, rowid, request, investigated, learned, completed, next_steps, notes)
        VALUES('delete', old.id, old.request, old.investigated, old.learned, old.completed, old.next_steps, old.notes);
      END;

      CREATE TRIGGER IF NOT EXISTS session_summaries_au AFTER UPDATE ON session_summaries BEGIN
        INSERT INTO session_summaries_fts(session_summaries_fts, rowid, request, investigated, learned, completed, next_steps, notes)
        VALUES('delete', old.id, old.request, old.investigated, old.learned, old.completed, old.next_steps, old.notes);
        INSERT INTO session_summaries_fts(rowid, request, investigated, learned, completed, next_steps, notes)
        VALUES (new.id, new.request, new.investigated, new.learned, new.completed, new.next_steps, new.notes);
      END;
    `;try{this.recreateObservationsWithCascade(i,a,d,c),this.recreateSessionSummariesWithCascade(m,T,E,g),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(21,new Date().toISOString()),this.db.run("COMMIT"),this.db.run("PRAGMA foreign_keys = ON"),u.debug("DB","Successfully added ON UPDATE CASCADE to FK constraints")}catch(f){throw this.db.run("ROLLBACK"),this.db.run("PRAGMA foreign_keys = ON"),f instanceof Error?f:new Error(String(f))}}recreateObservationsWithCascade(e,t,s,n){this.db.run(e),this.db.run(t),this.db.run("DROP TABLE observations"),this.db.run("ALTER TABLE observations_new RENAME TO observations"),this.db.run(s),this.db.prepare("SELECT name FROM sqlite_master WHERE type='table' AND name='observations_fts'").all().length>0&&this.db.run(n)}recreateSessionSummariesWithCascade(e,t,s,n){this.db.run(e),this.db.run(t),this.db.run("DROP TABLE session_summaries"),this.db.run("ALTER TABLE session_summaries_new RENAME TO session_summaries"),this.db.run(s),this.db.prepare("SELECT name FROM sqlite_master WHERE type='table' AND name='session_summaries_fts'").all().length>0&&this.db.run(n)}addObservationContentHashColumn(){if(this.db.query("PRAGMA table_info(observations)").all().some(s=>s.name==="content_hash")){this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(22,new Date().toISOString());return}this.db.run("ALTER TABLE observations ADD COLUMN content_hash TEXT"),this.db.run("UPDATE observations SET content_hash = substr(hex(randomblob(8)), 1, 16) WHERE content_hash IS NULL"),this.db.run("CREATE INDEX IF NOT EXISTS idx_observations_content_hash ON observations(content_hash, created_at_epoch)"),u.debug("DB","Added content_hash column to observations table with backfill and index"),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(22,new Date().toISOString())}addSessionCustomTitleColumn(){if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(23))return;this.db.query("PRAGMA table_info(sdk_sessions)").all().some(n=>n.name==="custom_title")||(this.db.run("ALTER TABLE sdk_sessions ADD COLUMN custom_title TEXT"),u.debug("DB","Added custom_title column to sdk_sessions table")),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(23,new Date().toISOString())}addSessionPlatformSourceColumn(){let t=this.db.query("PRAGMA table_info(sdk_sessions)").all().some(i=>i.name==="platform_source"),n=this.db.query("PRAGMA index_list(sdk_sessions)").all().some(i=>i.name==="idx_sdk_sessions_platform_source");this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(24)&&t&&n||(t||(this.db.run(`ALTER TABLE sdk_sessions ADD COLUMN platform_source TEXT NOT NULL DEFAULT '${R}'`),u.debug("DB","Added platform_source column to sdk_sessions table")),this.db.run(`
      UPDATE sdk_sessions
      SET platform_source = '${R}'
      WHERE platform_source IS NULL OR platform_source = ''
    `),n||this.db.run("CREATE INDEX IF NOT EXISTS idx_sdk_sessions_platform_source ON sdk_sessions(platform_source)"),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(24,new Date().toISOString()))}addObservationModelColumns(){let e=this.db.query("PRAGMA table_info(observations)").all(),t=e.some(n=>n.name==="generated_by_model"),s=e.some(n=>n.name==="relevance_count");t&&s||(t||this.db.run("ALTER TABLE observations ADD COLUMN generated_by_model TEXT"),s||this.db.run("ALTER TABLE observations ADD COLUMN relevance_count INTEGER DEFAULT 0"),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(26,new Date().toISOString()))}ensureMergedIntoProjectColumns(){this.db.query("PRAGMA table_info(observations)").all().some(s=>s.name==="merged_into_project")||this.db.run("ALTER TABLE observations ADD COLUMN merged_into_project TEXT"),this.db.run("CREATE INDEX IF NOT EXISTS idx_observations_merged_into ON observations(merged_into_project)"),this.db.query("PRAGMA table_info(session_summaries)").all().some(s=>s.name==="merged_into_project")||this.db.run("ALTER TABLE session_summaries ADD COLUMN merged_into_project TEXT"),this.db.run("CREATE INDEX IF NOT EXISTS idx_summaries_merged_into ON session_summaries(merged_into_project)")}addObservationSubagentColumns(){let e=this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(27),t=this.db.query("PRAGMA table_info(observations)").all(),s=t.some(i=>i.name==="agent_type"),n=t.some(i=>i.name==="agent_id");s||this.db.run("ALTER TABLE observations ADD COLUMN agent_type TEXT"),n||this.db.run("ALTER TABLE observations ADD COLUMN agent_id TEXT"),this.db.run("CREATE INDEX IF NOT EXISTS idx_observations_agent_type ON observations(agent_type)"),this.db.run("CREATE INDEX IF NOT EXISTS idx_observations_agent_id ON observations(agent_id)");let o=this.db.query("PRAGMA table_info(pending_messages)").all();if(o.length>0){let i=o.some(d=>d.name==="agent_type"),a=o.some(d=>d.name==="agent_id");i||this.db.run("ALTER TABLE pending_messages ADD COLUMN agent_type TEXT"),a||this.db.run("ALTER TABLE pending_messages ADD COLUMN agent_id TEXT")}e||this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(27,new Date().toISOString())}addPendingMessagesToolUseIdAndWorkerPidColumns(){if(this.db.query("SELECT name FROM sqlite_master WHERE type='table' AND name='pending_messages'").all().length===0){this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(28,new Date().toISOString());return}let t=this.db.query("PRAGMA table_info(pending_messages)").all(),s=t.some(o=>o.name==="tool_use_id"),n=t.some(o=>o.name==="worker_pid");s||this.db.run("ALTER TABLE pending_messages ADD COLUMN tool_use_id TEXT"),n||this.db.run("ALTER TABLE pending_messages ADD COLUMN worker_pid INTEGER"),this.db.run("BEGIN TRANSACTION");try{this.db.run("CREATE INDEX IF NOT EXISTS idx_pending_messages_worker_pid ON pending_messages(worker_pid)"),this.db.run(`
      DELETE FROM pending_messages
      WHERE tool_use_id IS NOT NULL
        AND id NOT IN (
          SELECT MIN(id) FROM pending_messages
          WHERE tool_use_id IS NOT NULL
          GROUP BY content_session_id, tool_use_id
        )
    `),this.db.run(`
      CREATE UNIQUE INDEX IF NOT EXISTS ux_pending_session_tool
      ON pending_messages(content_session_id, tool_use_id)
      WHERE tool_use_id IS NOT NULL
    `),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(28,new Date().toISOString()),this.db.run("COMMIT")}catch(o){throw this.db.run("ROLLBACK"),o}}addObservationsUniqueContentHashIndex(){if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(29))return;let t=this.db.query("PRAGMA table_info(observations)").all(),s=t.some(o=>o.name==="memory_session_id"),n=t.some(o=>o.name==="content_hash");if(!s||!n){this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(29,new Date().toISOString());return}this.db.run("BEGIN TRANSACTION");try{this.db.run(`
      DELETE FROM observations
      WHERE id NOT IN (
        SELECT MIN(id) FROM observations
        GROUP BY memory_session_id, content_hash
      )
    `),this.db.run(`
      CREATE UNIQUE INDEX IF NOT EXISTS ux_observations_session_hash
      ON observations(memory_session_id, content_hash)
    `),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(29,new Date().toISOString()),this.db.run("COMMIT")}catch(o){throw this.db.run("ROLLBACK"),o}}addObservationsMetadataColumn(){this.db.query("PRAGMA table_info(observations)").all().some(s=>s.name==="metadata")||(this.db.run("ALTER TABLE observations ADD COLUMN metadata TEXT"),u.debug("DB","Added metadata column to observations table (#2116)")),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(30,new Date().toISOString())}updateMemorySessionId(e,t){this.db.prepare(`
      UPDATE sdk_sessions
      SET memory_session_id = ?
      WHERE id = ?
    `).run(t,e)}markSessionCompleted(e){let t=Date.now(),s=new Date(t).toISOString();this.db.prepare(`
      UPDATE sdk_sessions
      SET status = 'completed', completed_at = ?, completed_at_epoch = ?
      WHERE id = ?
    `).run(s,t,e)}ensureMemorySessionIdRegistered(e,t){let s=this.db.prepare(`
      SELECT id, memory_session_id FROM sdk_sessions WHERE id = ?
    `).get(e);if(!s)throw new Error(`Session ${e} not found in sdk_sessions`);s.memory_session_id!==t&&(this.db.prepare(`
      UPDATE sdk_sessions SET memory_session_id = ? WHERE id = ?
    `).run(t,e),u.info("DB","Registered memory_session_id before storage (FK fix)",{sessionDbId:e,oldId:s.memory_session_id,newId:t}))}getRecentSummaries(e,t=10){return this.db.prepare(`
      SELECT
        request, investigated, learned, completed, next_steps,
        files_read, files_edited, notes, prompt_number, created_at
      FROM session_summaries
      WHERE project = ?
      ORDER BY created_at_epoch DESC
      LIMIT ?
    `).all(e,t)}getRecentSummariesWithSessionInfo(e,t=3){return this.db.prepare(`
      SELECT
        memory_session_id, request, learned, completed, next_steps,
        prompt_number, created_at
      FROM session_summaries
      WHERE project = ?
      ORDER BY created_at_epoch DESC
      LIMIT ?
    `).all(e,t)}getRecentObservations(e,t=20){return this.db.prepare(`
      SELECT type, text, prompt_number, created_at
      FROM observations
      WHERE project = ?
      ORDER BY created_at_epoch DESC
      LIMIT ?
    `).all(e,t)}getAllRecentObservations(e=100){return this.db.prepare(`
      SELECT
        o.id,
        o.type,
        o.title,
        o.subtitle,
        o.text,
        o.project,
        COALESCE(s.platform_source, '${R}') as platform_source,
        o.prompt_number,
        o.created_at,
        o.created_at_epoch
      FROM observations o
      LEFT JOIN sdk_sessions s ON o.memory_session_id = s.memory_session_id
      ORDER BY o.created_at_epoch DESC
      LIMIT ?
    `).all(e)}getAllRecentSummaries(e=50){return this.db.prepare(`
      SELECT
        ss.id,
        ss.request,
        ss.investigated,
        ss.learned,
        ss.completed,
        ss.next_steps,
        ss.files_read,
        ss.files_edited,
        ss.notes,
        ss.project,
        COALESCE(s.platform_source, '${R}') as platform_source,
        ss.prompt_number,
        ss.created_at,
        ss.created_at_epoch
      FROM session_summaries ss
      LEFT JOIN sdk_sessions s ON ss.memory_session_id = s.memory_session_id
      ORDER BY ss.created_at_epoch DESC
      LIMIT ?
    `).all(e)}getAllRecentUserPrompts(e=100){return this.db.prepare(`
      SELECT
        up.id,
        up.content_session_id,
        s.project,
        COALESCE(s.platform_source, '${R}') as platform_source,
        up.prompt_number,
        up.prompt_text,
        up.created_at,
        up.created_at_epoch
      FROM user_prompts up
      LEFT JOIN sdk_sessions s ON up.content_session_id = s.content_session_id
      ORDER BY up.created_at_epoch DESC
      LIMIT ?
    `).all(e)}getAllProjects(e){let t=e?v(e):void 0,s=`
      SELECT DISTINCT project
      FROM sdk_sessions
      WHERE project IS NOT NULL AND project != ''
        AND project != ?
    `,n=[se];return t&&(s+=" AND COALESCE(platform_source, ?) = ?",n.push(R,t)),s+=" ORDER BY project ASC",this.db.prepare(s).all(...n).map(i=>i.project)}getProjectCatalog(){let e=this.db.prepare(`
      SELECT
        COALESCE(platform_source, '${R}') as platform_source,
        project,
        MAX(started_at_epoch) as latest_epoch
      FROM sdk_sessions
      WHERE project IS NOT NULL AND project != ''
        AND project != ?
      GROUP BY COALESCE(platform_source, '${R}'), project
      ORDER BY latest_epoch DESC
    `).all(se),t=[],s=new Set,n={};for(let i of e){let a=v(i.platform_source);n[a]||(n[a]=[]),n[a].includes(i.project)||n[a].push(i.project),s.has(i.project)||(s.add(i.project),t.push(i.project))}let o=ye(Object.keys(n));return{projects:t,sources:o,projectsBySource:Object.fromEntries(o.map(i=>[i,n[i]||[]]))}}getLatestUserPrompt(e){return this.db.prepare(`
SELECT
|
|
up.*,
|
|
s.memory_session_id,
|
|
s.project,
|
|
COALESCE(s.platform_source, '${R}') as platform_source
|
|
FROM user_prompts up
|
|
JOIN sdk_sessions s ON up.content_session_id = s.content_session_id
|
|
WHERE up.content_session_id = ?
|
|
ORDER BY up.created_at_epoch DESC
|
|
LIMIT 1
|
|
`).get(e)}getRecentSessionsWithStatus(e,t=3){return this.db.prepare(`
|
|
SELECT * FROM (
|
|
SELECT
|
|
s.memory_session_id,
|
|
s.status,
|
|
s.started_at,
|
|
s.started_at_epoch,
|
|
s.user_prompt,
|
|
CASE WHEN sum.memory_session_id IS NOT NULL THEN 1 ELSE 0 END as has_summary
|
|
FROM sdk_sessions s
|
|
LEFT JOIN session_summaries sum ON s.memory_session_id = sum.memory_session_id
|
|
WHERE s.project = ? AND s.memory_session_id IS NOT NULL
|
|
GROUP BY s.memory_session_id
|
|
ORDER BY s.started_at_epoch DESC
|
|
LIMIT ?
|
|
)
|
|
ORDER BY started_at_epoch ASC
|
|
`).all(e,t)}getObservationsForSession(e){return this.db.prepare(`
|
|
SELECT title, subtitle, type, prompt_number
|
|
FROM observations
|
|
WHERE memory_session_id = ?
|
|
ORDER BY created_at_epoch ASC
|
|
`).all(e)}getObservationById(e){return this.db.prepare(`
|
|
SELECT *
|
|
FROM observations
|
|
WHERE id = ?
|
|
`).get(e)||null}getObservationsByIds(e,t={}){if(e.length===0)return[];let{orderBy:s="date_desc",limit:n,project:o,type:i,concepts:a,files:d}=t,c=s==="relevance",m=c?"":`ORDER BY created_at_epoch ${s==="date_asc"?"ASC":"DESC"}`,T=n?`LIMIT ${n}`:"",E=e.map(()=>"?").join(","),g=[...e],f=[];if(o&&(f.push("project = ?"),g.push(o)),i)if(Array.isArray(i)){let l=i.map(()=>"?").join(",");f.push(`type IN (${l})`),g.push(...i)}else f.push("type = ?"),g.push(i);if(a){let l=Array.isArray(a)?a:[a],I=l.map(()=>"EXISTS (SELECT 1 FROM json_each(concepts) WHERE value = ?)");g.push(...l),f.push(`(${I.join(" OR ")})`)}if(d){let l=Array.isArray(d)?d:[d],I=l.map(()=>"(EXISTS (SELECT 1 FROM json_each(files_read) WHERE value LIKE ?) OR EXISTS (SELECT 1 FROM json_each(files_modified) WHERE value LIKE ?))");l.forEach(y=>{g.push(`%${y}%`,`%${y}%`)}),f.push(`(${I.join(" OR ")})`)}let S=f.length>0?`WHERE id IN (${E}) AND ${f.join(" AND ")}`:`WHERE id IN (${E})`,O=this.db.prepare(`
SELECT *
FROM observations
${S}
${m}
${T}
`).all(...g);if(!c)return O;let b=new Map(O.map(l=>[l.id,l]));return e.map(l=>b.get(l)).filter(l=>!!l)}getSummaryForSession(e){return this.db.prepare(`
SELECT
request, investigated, learned, completed, next_steps,
files_read, files_edited, notes, prompt_number, created_at,
created_at_epoch
FROM session_summaries
WHERE memory_session_id = ?
ORDER BY created_at_epoch DESC
LIMIT 1
`).get(e)||null}getFilesForSession(e){let s=this.db.prepare(`
SELECT files_read, files_modified
FROM observations
WHERE memory_session_id = ?
`).all(e),n=new Set,o=new Set;for(let i of s)ne(i.files_read).forEach(a=>n.add(a)),ne(i.files_modified).forEach(a=>o.add(a));return{filesRead:Array.from(n),filesModified:Array.from(o)}}getSessionById(e){return this.db.prepare(`
SELECT id, content_session_id, memory_session_id, project,
COALESCE(platform_source, '${R}') as platform_source,
user_prompt, custom_title, status
FROM sdk_sessions
WHERE id = ?
LIMIT 1
`).get(e)||null}getSdkSessionsBySessionIds(e){if(e.length===0)return[];let t=e.map(()=>"?").join(",");return this.db.prepare(`
SELECT id, content_session_id, memory_session_id, project,
COALESCE(platform_source, '${R}') as platform_source,
user_prompt, custom_title,
started_at, started_at_epoch, completed_at, completed_at_epoch, status
FROM sdk_sessions
WHERE memory_session_id IN (${t})
ORDER BY started_at_epoch DESC
`).all(...e)}getPromptNumberFromUserPrompts(e){return this.db.prepare(`
SELECT COUNT(*) as count FROM user_prompts WHERE content_session_id = ?
`).get(e).count}createSDKSession(e,t,s,n,o){let i=new Date,a=i.getTime(),d=jt(n,o),c=d.platformSource??R,m=this.db.prepare(`
SELECT id, platform_source FROM sdk_sessions WHERE content_session_id = ?
`).get(e);if(m){if(t&&this.db.prepare(`
UPDATE sdk_sessions SET project = ?
WHERE content_session_id = ? AND (project IS NULL OR project = '')
`).run(t,e),d.customTitle&&this.db.prepare(`
UPDATE sdk_sessions SET custom_title = ?
WHERE content_session_id = ? AND custom_title IS NULL
`).run(d.customTitle,e),d.platformSource){let E=m.platform_source?.trim()?v(m.platform_source):void 0;if(!E)this.db.prepare(`
UPDATE sdk_sessions SET platform_source = ?
WHERE content_session_id = ?
AND COALESCE(platform_source, '') = ''
`).run(d.platformSource,e);else if(E!==d.platformSource)throw new Error(`Platform source conflict for session ${e}: existing=${E}, received=${d.platformSource}`)}return m.id}return this.db.prepare(`
INSERT INTO sdk_sessions
(content_session_id, memory_session_id, project, platform_source, user_prompt, custom_title, started_at, started_at_epoch, status)
VALUES (?, NULL, ?, ?, ?, ?, ?, ?, 'active')
`).run(e,t,c,s,d.customTitle||null,i.toISOString(),a),this.db.prepare("SELECT id FROM sdk_sessions WHERE content_session_id = ?").get(e).id}saveUserPrompt(e,t,s){let n=new Date,o=n.getTime();return this.db.prepare(`
INSERT INTO user_prompts
(content_session_id, prompt_number, prompt_text, created_at, created_at_epoch)
VALUES (?, ?, ?, ?, ?)
`).run(e,t,s,n.toISOString(),o).lastInsertRowid}getUserPrompt(e,t){return this.db.prepare(`
SELECT prompt_text
FROM user_prompts
WHERE content_session_id = ? AND prompt_number = ?
LIMIT 1
`).get(e,t)?.prompt_text??null}storeObservation(e,t,s,n,o=0,i,a){let d=i??Date.now(),c=new Date(d).toISOString(),m=B(e,s.title,s.narrative),E=this.db.prepare(`
INSERT INTO observations
(memory_session_id, project, type, title, subtitle, facts, narrative, concepts,
files_read, files_modified, prompt_number, discovery_tokens, agent_type, agent_id, content_hash, created_at, created_at_epoch,
generated_by_model, metadata)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
ON CONFLICT(memory_session_id, content_hash) DO NOTHING
RETURNING id, created_at_epoch
`).get(e,t,s.type,s.title,s.subtitle,JSON.stringify(s.facts),s.narrative,JSON.stringify(s.concepts),JSON.stringify(s.files_read),JSON.stringify(s.files_modified),n||null,o,s.agent_type??null,s.agent_id??null,m,c,d,a||null,s.metadata??null);if(E)return{id:E.id,createdAtEpoch:E.created_at_epoch};let g=this.db.prepare("SELECT id, created_at_epoch FROM observations WHERE memory_session_id = ? AND content_hash = ?").get(e,m);if(!g)throw new Error(`storeObservation: ON CONFLICT without existing row for content_hash=${m}`);return{id:g.id,createdAtEpoch:g.created_at_epoch}}storeSummary(e,t,s,n,o=0,i){let a=i??Date.now(),d=new Date(a).toISOString(),m=this.db.prepare(`
INSERT INTO session_summaries
(memory_session_id, project, request, investigated, learned, completed,
next_steps, notes, prompt_number, discovery_tokens, created_at, created_at_epoch)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
`).run(e,t,s.request,s.investigated,s.learned,s.completed,s.next_steps,s.notes,n||null,o,d,a);return{id:Number(m.lastInsertRowid),createdAtEpoch:a}}storeObservations(e,t,s,n,o,i=0,a,d){let c=a??Date.now(),m=new Date(c).toISOString();return this.db.transaction(()=>{let E=[],g=this.db.prepare(`
INSERT INTO observations
(memory_session_id, project, type, title, subtitle, facts, narrative, concepts,
files_read, files_modified, prompt_number, discovery_tokens, agent_type, agent_id, content_hash, created_at, created_at_epoch,
generated_by_model)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
ON CONFLICT(memory_session_id, content_hash) DO NOTHING
RETURNING id
`),f=this.db.prepare("SELECT id FROM observations WHERE memory_session_id = ? AND content_hash = ?");for(let p of s){let O=B(e,p.title,p.narrative),b=g.get(e,t,p.type,p.title,p.subtitle,JSON.stringify(p.facts),p.narrative,JSON.stringify(p.concepts),JSON.stringify(p.files_read),JSON.stringify(p.files_modified),o||null,i,p.agent_type??null,p.agent_id??null,O,m,c,d||null);if(b){E.push(b.id);continue}let l=f.get(e,O);if(!l)throw new Error(`storeObservations: ON CONFLICT without existing row for content_hash=${O}`);E.push(l.id)}let S=null;if(n){let O=this.db.prepare(`
INSERT INTO session_summaries
(memory_session_id, project, request, investigated, learned, completed,
next_steps, notes, prompt_number, discovery_tokens, created_at, created_at_epoch)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
`).run(e,t,n.request,n.investigated,n.learned,n.completed,n.next_steps,n.notes,o||null,i,m,c);S=Number(O.lastInsertRowid)}return{observationIds:E,summaryId:S,createdAtEpoch:c}})()}storeObservationsAndMarkComplete(e,t,s,n,o,i,a,d=0,c,m){let T=c??Date.now(),E=new Date(T).toISOString();return this.db.transaction(()=>{let f=[],S=this.db.prepare(`
INSERT INTO observations
(memory_session_id, project, type, title, subtitle, facts, narrative, concepts,
files_read, files_modified, prompt_number, discovery_tokens, agent_type, agent_id, content_hash, created_at, created_at_epoch,
generated_by_model)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
ON CONFLICT(memory_session_id, content_hash) DO NOTHING
RETURNING id
`),p=this.db.prepare("SELECT id FROM observations WHERE memory_session_id = ? AND content_hash = ?");for(let l of s){let I=B(e,l.title,l.narrative),y=S.get(e,t,l.type,l.title,l.subtitle,JSON.stringify(l.facts),l.narrative,JSON.stringify(l.concepts),JSON.stringify(l.files_read),JSON.stringify(l.files_modified),a||null,d,l.agent_type??null,l.agent_id??null,I,E,T,m||null);if(y){f.push(y.id);continue}let Se=p.get(e,I);if(!Se)throw new Error(`storeObservationsAndMarkComplete: ON CONFLICT without existing row for content_hash=${I}`);f.push(Se.id)}let O;if(n){let I=this.db.prepare(`
INSERT INTO session_summaries
(memory_session_id, project, request, investigated, learned, completed,
next_steps, notes, prompt_number, discovery_tokens, created_at, created_at_epoch)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
`).run(e,t,n.request,n.investigated,n.learned,n.completed,n.next_steps,n.notes,a||null,d,E,T);O=Number(I.lastInsertRowid)}return this.db.prepare(`
UPDATE pending_messages
SET
status = 'processed',
completed_at_epoch = ?,
tool_input = NULL,
tool_response = NULL
WHERE id = ? AND status = 'processing'
`).run(T,o),{observationIds:f,summaryId:O,createdAtEpoch:T}})()}getSessionSummariesByIds(e,t={}){if(e.length===0)return[];let{orderBy:s="date_desc",limit:n,project:o}=t,i=s==="relevance",a=i?"":`ORDER BY created_at_epoch ${s==="date_asc"?"ASC":"DESC"}`,d=n?`LIMIT ${n}`:"",c=e.map(()=>"?").join(","),m=[...e],T=o?`WHERE id IN (${c}) AND project = ?`:`WHERE id IN (${c})`;o&&m.push(o);let g=this.db.prepare(`
SELECT * FROM session_summaries
${T}
${a}
${d}
`).all(...m);if(!i)return g;let f=new Map(g.map(S=>[S.id,S]));return e.map(S=>f.get(S)).filter(S=>!!S)}getUserPromptsByIds(e,t={}){if(e.length===0)return[];let{orderBy:s="date_desc",limit:n,project:o}=t,i=s==="relevance",a=i?"":`ORDER BY up.created_at_epoch ${s==="date_asc"?"ASC":"DESC"}`,d=n?`LIMIT ${n}`:"",c=e.map(()=>"?").join(","),m=[...e],T=o?"AND s.project = ?":"";o&&m.push(o);let g=this.db.prepare(`
SELECT
up.*,
s.project,
s.memory_session_id
FROM user_prompts up
JOIN sdk_sessions s ON up.content_session_id = s.content_session_id
WHERE up.id IN (${c}) ${T}
${a}
${d}
`).all(...m);if(!i)return g;let f=new Map(g.map(S=>[S.id,S]));return e.map(S=>f.get(S)).filter(S=>!!S)}getTimelineAroundTimestamp(e,t=10,s=10,n){return this.getTimelineAroundObservation(null,e,t,s,n)}getTimelineAroundObservation(e,t,s=10,n=10,o){let i=o?"AND project = ?":"",a=o?[o]:[],d,c;if(e!==null){let p=`
SELECT id, created_at_epoch
FROM observations
WHERE id <= ? ${i}
ORDER BY id DESC
LIMIT ?
`,O=`
SELECT id, created_at_epoch
FROM observations
WHERE id >= ? ${i}
ORDER BY id ASC
LIMIT ?
`;try{let b=this.db.prepare(p).all(e,...a,s+1),l=this.db.prepare(O).all(e,...a,n+1);if(b.length===0&&l.length===0)return{observations:[],sessions:[],prompts:[]};d=b.length>0?b[b.length-1].created_at_epoch:t,c=l.length>0?l[l.length-1].created_at_epoch:t}catch(b){return b instanceof Error?u.error("DB","Error getting boundary observations",{project:o},b):u.error("DB","Error getting boundary observations with non-Error",{},new Error(String(b))),{observations:[],sessions:[],prompts:[]}}}else{let p=`
SELECT created_at_epoch
FROM observations
WHERE created_at_epoch <= ? ${i}
ORDER BY created_at_epoch DESC
LIMIT ?
`,O=`
SELECT created_at_epoch
FROM observations
WHERE created_at_epoch >= ? ${i}
ORDER BY created_at_epoch ASC
LIMIT ?
`;try{let b=this.db.prepare(p).all(t,...a,s),l=this.db.prepare(O).all(t,...a,n+1);if(b.length===0&&l.length===0)return{observations:[],sessions:[],prompts:[]};d=b.length>0?b[b.length-1].created_at_epoch:t,c=l.length>0?l[l.length-1].created_at_epoch:t}catch(b){return b instanceof Error?u.error("DB","Error getting boundary timestamps",{project:o},b):u.error("DB","Error getting boundary timestamps with non-Error",{},new Error(String(b))),{observations:[],sessions:[],prompts:[]}}}let m=`
SELECT *
FROM observations
WHERE created_at_epoch >= ? AND created_at_epoch <= ? ${i}
ORDER BY created_at_epoch ASC
`,T=`
SELECT *
FROM session_summaries
WHERE created_at_epoch >= ? AND created_at_epoch <= ? ${i}
ORDER BY created_at_epoch ASC
`,E=`
SELECT up.*, s.project, s.memory_session_id
FROM user_prompts up
JOIN sdk_sessions s ON up.content_session_id = s.content_session_id
WHERE up.created_at_epoch >= ? AND up.created_at_epoch <= ? ${i.replace("project","s.project")}
ORDER BY up.created_at_epoch ASC
`,g=this.db.prepare(m).all(d,c,...a),f=this.db.prepare(T).all(d,c,...a),S=this.db.prepare(E).all(d,c,...a);return{observations:g,sessions:f.map(p=>({id:p.id,memory_session_id:p.memory_session_id,project:p.project,request:p.request,completed:p.completed,next_steps:p.next_steps,created_at:p.created_at,created_at_epoch:p.created_at_epoch})),prompts:S.map(p=>({id:p.id,content_session_id:p.content_session_id,prompt_number:p.prompt_number,prompt_text:p.prompt_text,project:p.project,created_at:p.created_at,created_at_epoch:p.created_at_epoch}))}}getPromptById(e){return this.db.prepare(`
SELECT
p.id,
p.content_session_id,
p.prompt_number,
p.prompt_text,
s.project,
p.created_at,
p.created_at_epoch
FROM user_prompts p
LEFT JOIN sdk_sessions s ON p.content_session_id = s.content_session_id
WHERE p.id = ?
LIMIT 1
`).get(e)||null}getPromptsByIds(e){if(e.length===0)return[];let t=e.map(()=>"?").join(",");return this.db.prepare(`
SELECT
p.id,
p.content_session_id,
p.prompt_number,
p.prompt_text,
s.project,
p.created_at,
p.created_at_epoch
FROM user_prompts p
LEFT JOIN sdk_sessions s ON p.content_session_id = s.content_session_id
WHERE p.id IN (${t})
ORDER BY p.created_at_epoch DESC
`).all(...e)}getSessionSummaryById(e){return this.db.prepare(`
SELECT
id,
memory_session_id,
content_session_id,
project,
user_prompt,
request_summary,
learned_summary,
status,
created_at,
created_at_epoch
FROM sdk_sessions
WHERE id = ?
LIMIT 1
`).get(e)||null}getOrCreateManualSession(e){let t=`manual-${e}`,s=`manual-content-${e}`;if(this.db.prepare("SELECT memory_session_id FROM sdk_sessions WHERE memory_session_id = ?").get(t))return t;let o=new Date;return this.db.prepare(`
INSERT INTO sdk_sessions (memory_session_id, content_session_id, project, platform_source, started_at, started_at_epoch, status)
VALUES (?, ?, ?, ?, ?, ?, 'active')
`).run(t,s,e,R,o.toISOString(),o.getTime()),u.info("SESSION","Created manual session",{memorySessionId:t,project:e}),t}close(){this.db.close()}importSdkSession(e){let t=this.db.prepare("SELECT id FROM sdk_sessions WHERE content_session_id = ?").get(e.content_session_id);return t?{imported:!1,id:t.id}:{imported:!0,id:this.db.prepare(`
INSERT INTO sdk_sessions (
content_session_id, memory_session_id, project, platform_source, user_prompt,
started_at, started_at_epoch, completed_at, completed_at_epoch, status
) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
`).run(e.content_session_id,e.memory_session_id,e.project,v(e.platform_source),e.user_prompt,e.started_at,e.started_at_epoch,e.completed_at,e.completed_at_epoch,e.status).lastInsertRowid}}importSessionSummary(e){let t=this.db.prepare("SELECT id FROM session_summaries WHERE memory_session_id = ?").get(e.memory_session_id);return t?{imported:!1,id:t.id}:{imported:!0,id:this.db.prepare(`
INSERT INTO session_summaries (
memory_session_id, project, request, investigated, learned,
completed, next_steps, files_read, files_edited, notes,
prompt_number, discovery_tokens, created_at, created_at_epoch
) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
`).run(e.memory_session_id,e.project,e.request,e.investigated,e.learned,e.completed,e.next_steps,e.files_read,e.files_edited,e.notes,e.prompt_number,e.discovery_tokens||0,e.created_at,e.created_at_epoch).lastInsertRowid}}importObservation(e){let t=this.db.prepare(`
SELECT id FROM observations
WHERE memory_session_id = ? AND title = ? AND created_at_epoch = ?
`).get(e.memory_session_id,e.title,e.created_at_epoch);return t?{imported:!1,id:t.id}:{imported:!0,id:this.db.prepare(`
INSERT INTO observations (
memory_session_id, project, text, type, title, subtitle,
facts, narrative, concepts, files_read, files_modified,
prompt_number, discovery_tokens, agent_type, agent_id,
created_at, created_at_epoch
) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
`).run(e.memory_session_id,e.project,e.text,e.type,e.title,e.subtitle,e.facts,e.narrative,e.concepts,e.files_read,e.files_modified,e.prompt_number,e.discovery_tokens||0,e.agent_type??null,e.agent_id??null,e.created_at,e.created_at_epoch).lastInsertRowid}}rebuildObservationsFTSIndex(){this.db.prepare("SELECT name FROM sqlite_master WHERE type='table' AND name='observations_fts'").all().length>0&&this.db.run("INSERT INTO observations_fts(observations_fts) VALUES('rebuild')")}importUserPrompt(e){let t=this.db.prepare(`
SELECT id FROM user_prompts
WHERE content_session_id = ? AND prompt_number = ?
`).get(e.content_session_id,e.prompt_number);return t?{imported:!1,id:t.id}:{imported:!0,id:this.db.prepare(`
INSERT INTO user_prompts (
content_session_id, prompt_number, prompt_text,
created_at, created_at_epoch
) VALUES (?, ?, ?, ?, ?)
`).run(e.content_session_id,e.prompt_number,e.prompt_text,e.created_at,e.created_at_epoch).lastInsertRowid}}};var Ue=M(require("path"),1),xe=require("os");var C=require("fs"),w=require("path"),ie=require("os"),q=class{static DEFAULTS={CLAUDE_MEM_MODEL:"claude-haiku-4-5-20251001",CLAUDE_MEM_CONTEXT_OBSERVATIONS:"50",CLAUDE_MEM_WORKER_PORT:String(37700+(process.getuid?.()??77)%100),CLAUDE_MEM_WORKER_HOST:"127.0.0.1",CLAUDE_MEM_SKIP_TOOLS:"ListMcpResourcesTool,SlashCommand,Skill,TodoWrite,AskUserQuestion",CLAUDE_MEM_PROVIDER:"claude",CLAUDE_MEM_CLAUDE_AUTH_METHOD:"cli",CLAUDE_MEM_GEMINI_API_KEY:"",CLAUDE_MEM_GEMINI_MODEL:"gemini-2.5-flash-lite",CLAUDE_MEM_GEMINI_RATE_LIMITING_ENABLED:"true",CLAUDE_MEM_GEMINI_MAX_CONTEXT_MESSAGES:"20",CLAUDE_MEM_GEMINI_MAX_TOKENS:"100000",CLAUDE_MEM_OPENROUTER_API_KEY:"",CLAUDE_MEM_OPENROUTER_MODEL:"xiaomi/mimo-v2-flash:free",CLAUDE_MEM_OPENROUTER_SITE_URL:"",CLAUDE_MEM_OPENROUTER_APP_NAME:"claude-mem",CLAUDE_MEM_OPENROUTER_MAX_CONTEXT_MESSAGES:"20",CLAUDE_MEM_OPENROUTER_MAX_TOKENS:"100000",CLAUDE_MEM_DATA_DIR:(0,w.join)((0,ie.homedir)(),".claude-mem"),CLAUDE_MEM_LOG_LEVEL:"INFO",CLAUDE_MEM_PYTHON_VERSION:"3.13",CLAUDE_CODE_PATH:"",CLAUDE_MEM_MODE:"code",CLAUDE_MEM_CONTEXT_SHOW_READ_TOKENS:"false",CLAUDE_MEM_CONTEXT_SHOW_WORK_TOKENS:"false",CLAUDE_MEM_CONTEXT_SHOW_SAVINGS_AMOUNT:"false",CLAUDE_MEM_CONTEXT_SHOW_SAVINGS_PERCENT:"true",CLAUDE_MEM_CONTEXT_FULL_COUNT:"0",CLAUDE_MEM_CONTEXT_FULL_FIELD:"narrative",CLAUDE_MEM_CONTEXT_SESSION_COUNT:"10",CLAUDE_MEM_CONTEXT_SHOW_LAST_SUMMARY:"true",CLAUDE_MEM_CONTEXT_SHOW_LAST_MESSAGE:"false",CLAUDE_MEM_CONTEXT_SHOW_TERMINAL_OUTPUT:"true",CLAUDE_MEM_WELCOME_HINT_ENABLED:"true",CLAUDE_MEM_FOLDER_CLAUDEMD_ENABLED:"false",CLAUDE_MEM_FOLDER_USE_LOCAL_MD:"false",CLAUDE_MEM_TRANSCRIPTS_ENABLED:"true",CLAUDE_MEM_TRANSCRIPTS_CONFIG_PATH:(0,w.join)((0,ie.homedir)(),".claude-mem","transcript-watch.json"),CLAUDE_MEM_MAX_CONCURRENT_AGENTS:"2",CLAUDE_MEM_HOOK_FAIL_LOUD_THRESHOLD:"3",CLAUDE_MEM_EXCLUDED_PROJECTS:"",CLAUDE_MEM_FOLDER_MD_EXCLUDE:"[]",CLAUDE_MEM_SEMANTIC_INJECT:"false",CLAUDE_MEM_SEMANTIC_INJECT_LIMIT:"5",CLAUDE_MEM_TIER_ROUTING_ENABLED:"true",CLAUDE_MEM_TIER_SIMPLE_MODEL:"haiku",CLAUDE_MEM_TIER_SUMMARY_MODEL:"",CLAUDE_MEM_CHROMA_ENABLED:"true",CLAUDE_MEM_CHROMA_MODE:"local",CLAUDE_MEM_CHROMA_HOST:"127.0.0.1",CLAUDE_MEM_CHROMA_PORT:"8000",CLAUDE_MEM_CHROMA_SSL:"false",CLAUDE_MEM_CHROMA_API_KEY:"",CLAUDE_MEM_CHROMA_TENANT:"default_tenant",CLAUDE_MEM_CHROMA_DATABASE:"default_database",CLAUDE_MEM_TELEGRAM_ENABLED:"true",CLAUDE_MEM_TELEGRAM_BOT_TOKEN:"",CLAUDE_MEM_TELEGRAM_CHAT_ID:"",CLAUDE_MEM_TELEGRAM_TRIGGER_TYPES:"security_alert",CLAUDE_MEM_TELEGRAM_TRIGGER_CONCEPTS:""};static getAllDefaults(){return{...this.DEFAULTS}}static get(e){return process.env[e]??this.DEFAULTS[e]}static getInt(e){let t=this.get(e);return parseInt(t,10)}static getBool(e){let t=this.get(e);return t==="true"||t===!0}static applyEnvOverrides(e){let t={...e};for(let s of Object.keys(this.DEFAULTS))process.env[s]!==void 0&&(t[s]=process.env[s]);return t}static loadFromFile(e){try{if(!(0,C.existsSync)(e)){let i=this.getAllDefaults();try{let a=(0,w.dirname)(e);(0,C.existsSync)(a)||(0,C.mkdirSync)(a,{recursive:!0}),(0,C.writeFileSync)(e,JSON.stringify(i,null,2),"utf-8"),console.log("[SETTINGS] Created settings file with defaults:",e)}catch(a){console.warn("[SETTINGS] Failed to create settings file, using in-memory defaults:",e,a instanceof Error?a.message:String(a))}return this.applyEnvOverrides(i)}let t=(0,C.readFileSync)(e,"utf-8"),s=JSON.parse(t),n=s;if(s.env&&typeof s.env=="object"){n=s.env;try{(0,C.writeFileSync)(e,JSON.stringify(n,null,2),"utf-8"),console.log("[SETTINGS] Migrated settings file from nested to flat schema:",e)}catch(i){console.warn("[SETTINGS] Failed to auto-migrate settings file:",e,i instanceof Error?i.message:String(i))}}let o={...this.DEFAULTS};for(let i of Object.keys(this.DEFAULTS))n[i]!==void 0&&(o[i]=n[i]);return this.applyEnvOverrides(o)}catch(t){return console.warn("[SETTINGS] Failed to load settings, using defaults:",e,t instanceof Error?t.message:String(t)),this.applyEnvOverrides(this.getAllDefaults())}}};var k=require("fs"),V=require("path");var A=class r{static instance=null;activeMode=null;modesDir;constructor(){let e=Ce(),t=[(0,V.join)(e,"modes"),(0,V.join)(e,"..","plugin","modes")],s=t.find(n=>(0,k.existsSync)(n));this.modesDir=s||t[0]}static getInstance(){return r.instance||(r.instance=new r),r.instance}parseInheritance(e){let t=e.split("--");if(t.length===1)return{hasParent:!1,parentId:"",overrideId:""};if(t.length>2)throw new Error(`Invalid mode inheritance: ${e}. Only one level of inheritance supported (parent--override)`);return{hasParent:!0,parentId:t[0],overrideId:e}}isPlainObject(e){return e!==null&&typeof e=="object"&&!Array.isArray(e)}deepMerge(e,t){let s={...e};for(let n in t){let o=t[n],i=e[n];this.isPlainObject(o)&&this.isPlainObject(i)?s[n]=this.deepMerge(i,o):s[n]=o}return s}loadModeFile(e){let t=(0,V.join)(this.modesDir,`${e}.json`);if(!(0,k.existsSync)(t))throw new Error(`Mode file not found: ${t}`);let s=(0,k.readFileSync)(t,"utf-8");return JSON.parse(s)}loadMode(e){let t=this.parseInheritance(e);if(!t.hasParent)try{let d=this.loadModeFile(e);return this.activeMode=d,u.debug("SYSTEM",`Loaded mode: ${d.name} (${e})`,void 0,{types:d.observation_types.map(c=>c.id),concepts:d.observation_concepts.map(c=>c.id)}),d}catch(d){if(d instanceof Error?u.warn("WORKER",`Mode file not found: ${e}, falling back to 'code'`,{message:d.message}):u.warn("WORKER",`Mode file not found: ${e}, falling back to 'code'`,{error:String(d)}),e==="code")throw new Error("Critical: code.json mode file missing");return this.loadMode("code")}let{parentId:s,overrideId:n}=t,o;try{o=this.loadMode(s)}catch(d){d instanceof Error?u.warn("WORKER",`Parent mode '${s}' not found for ${e}, falling back to 'code'`,{message:d.message}):u.warn("WORKER",`Parent mode '${s}' not found for ${e}, falling back to 'code'`,{error:String(d)}),o=this.loadMode("code")}let i;try{i=this.loadModeFile(n),u.debug("SYSTEM",`Loaded override file: ${n} for parent ${s}`)}catch(d){return d instanceof Error?u.warn("WORKER",`Override file '${n}' not found, using parent mode '${s}' only`,{message:d.message}):u.warn("WORKER",`Override file '${n}' not found, using parent mode '${s}' only`,{error:String(d)}),this.activeMode=o,o}if(!i)return u.warn("SYSTEM",`Invalid override file: ${n}, using parent mode '${s}' only`),this.activeMode=o,o;let a=this.deepMerge(o,i);return this.activeMode=a,u.debug("SYSTEM",`Loaded mode with inheritance: ${a.name} (${e} = ${s} + ${n})`,void 0,{parent:s,override:n,types:a.observation_types.map(d=>d.id),concepts:a.observation_concepts.map(d=>d.id)}),a}getActiveMode(){if(!this.activeMode)throw new Error("No mode loaded. Call loadMode() first.");return this.activeMode}getObservationTypes(){return this.getActiveMode().observation_types}getObservationConcepts(){return this.getActiveMode().observation_concepts}getTypeIcon(e){return this.getObservationTypes().find(s=>s.id===e)?.emoji||"\u{1F4DD}"}getWorkEmoji(e){return this.getObservationTypes().find(s=>s.id===e)?.work_emoji||"\u{1F4DD}"}validateType(e){return this.getObservationTypes().some(t=>t.id===e)}getTypeLabel(e){return this.getObservationTypes().find(s=>s.id===e)?.label||e}};function ae(){let r=Ue.default.join((0,xe.homedir)(),".claude-mem","settings.json"),e=q.loadFromFile(r),t=A.getInstance().getActiveMode(),s=new Set(t.observation_types.map(o=>o.id)),n=new Set(t.observation_concepts.map(o=>o.id));return{totalObservationCount:parseInt(e.CLAUDE_MEM_CONTEXT_OBSERVATIONS,10),fullObservationCount:parseInt(e.CLAUDE_MEM_CONTEXT_FULL_COUNT,10),sessionCount:parseInt(e.CLAUDE_MEM_CONTEXT_SESSION_COUNT,10),showReadTokens:e.CLAUDE_MEM_CONTEXT_SHOW_READ_TOKENS==="true",showWorkTokens:e.CLAUDE_MEM_CONTEXT_SHOW_WORK_TOKENS==="true",showSavingsAmount:e.CLAUDE_MEM_CONTEXT_SHOW_SAVINGS_AMOUNT==="true",showSavingsPercent:e.CLAUDE_MEM_CONTEXT_SHOW_SAVINGS_PERCENT==="true",observationTypes:s,observationConcepts:n,fullObservationField:e.CLAUDE_MEM_CONTEXT_FULL_FIELD,showLastSummary:e.CLAUDE_MEM_CONTEXT_SHOW_LAST_SUMMARY==="true",showLastMessage:e.CLAUDE_MEM_CONTEXT_SHOW_LAST_MESSAGE==="true"}}var _={reset:"\x1B[0m",bright:"\x1B[1m",dim:"\x1B[2m",cyan:"\x1B[36m",green:"\x1B[32m",yellow:"\x1B[33m",blue:"\x1B[34m",magenta:"\x1B[35m",gray:"\x1B[90m",red:"\x1B[31m"},we=4,de=1;function _e(r){let e=(r.title?.length||0)+(r.subtitle?.length||0)+(r.narrative?.length||0)+JSON.stringify(r.facts||[]).length;return Math.ceil(e/we)}function ue(r){let e=r.length,t=r.reduce((i,a)=>i+_e(a),0),s=r.reduce((i,a)=>i+(a.discovery_tokens||0),0),n=s-t,o=s>0?Math.round(n/s*100):0;return{totalObservations:e,totalReadTokens:t,totalDiscoveryTokens:s,savings:n,savingsPercent:o}}function Bt(r){return A.getInstance().getWorkEmoji(r)}function F(r,e){let t=_e(r),s=r.discovery_tokens||0,n=Bt(r.type),o=s>0?`${n} ${s.toLocaleString()}`:"-";return{readTokens:t,discoveryTokens:s,discoveryDisplay:o,workEmoji:n}}function Y(r){return r.showReadTokens||r.showWorkTokens||r.showSavingsAmount||r.showSavingsPercent}var Fe=M(require("path"),1),K=require("fs");var Wt=["private","claude-mem-context","system_instruction","system-instruction","persisted-output","system-reminder"],Ks=new RegExp(`<(${Wt.join("|")})\\b[^>]*>[\\s\\S]*?</\\1>`,"g"),ke=/<system-reminder>[\s\S]*?<\/system-reminder>/g;var qt=["task-notification"],Js=new RegExp(`^\\s*<(${qt.join("|")})\\b[^>]*>(?:(?!<\\1\\b|</\\1\\b)[\\s\\S])*</\\1>\\s*$`),Qs=256*1024;function me(r,e,t){let s=Array.from(t.observationTypes),n=s.map(()=>"?").join(","),o=Array.from(t.observationConcepts),i=o.map(()=>"?").join(",");return r.db.prepare(`
SELECT
o.id,
o.memory_session_id,
COALESCE(s.platform_source, 'claude') as platform_source,
o.type,
o.title,
o.subtitle,
o.narrative,
o.facts,
o.concepts,
o.files_read,
o.files_modified,
o.discovery_tokens,
o.created_at,
o.created_at_epoch
FROM observations o
LEFT JOIN sdk_sessions s ON o.memory_session_id = s.memory_session_id
WHERE (o.project = ? OR o.merged_into_project = ?)
AND type IN (${n})
AND EXISTS (
SELECT 1 FROM json_each(o.concepts)
WHERE value IN (${i})
)
ORDER BY o.created_at_epoch DESC
LIMIT ?
`).all(e,e,...s,...o,t.totalObservationCount)}function ce(r,e,t){return r.db.prepare(`
SELECT
ss.id,
ss.memory_session_id,
COALESCE(s.platform_source, 'claude') as platform_source,
ss.request,
ss.investigated,
ss.learned,
ss.completed,
ss.next_steps,
ss.created_at,
ss.created_at_epoch
FROM session_summaries ss
LEFT JOIN sdk_sessions s ON ss.memory_session_id = s.memory_session_id
WHERE (ss.project = ? OR ss.merged_into_project = ?)
ORDER BY ss.created_at_epoch DESC
LIMIT ?
`).all(e,e,t.sessionCount+de)}function $e(r,e,t){let s=Array.from(t.observationTypes),n=s.map(()=>"?").join(","),o=Array.from(t.observationConcepts),i=o.map(()=>"?").join(","),a=e.map(()=>"?").join(",");return r.db.prepare(`
SELECT
o.id,
o.memory_session_id,
COALESCE(s.platform_source, 'claude') as platform_source,
o.type,
o.title,
o.subtitle,
o.narrative,
o.facts,
o.concepts,
o.files_read,
o.files_modified,
o.discovery_tokens,
o.created_at,
o.created_at_epoch,
o.project
FROM observations o
LEFT JOIN sdk_sessions s ON o.memory_session_id = s.memory_session_id
WHERE (o.project IN (${a})
OR o.merged_into_project IN (${a}))
AND type IN (${n})
AND EXISTS (
SELECT 1 FROM json_each(o.concepts)
WHERE value IN (${i})
)
ORDER BY o.created_at_epoch DESC
LIMIT ?
`).all(...e,...e,...s,...o,t.totalObservationCount)}function Pe(r,e,t){let s=e.map(()=>"?").join(",");return r.db.prepare(`
SELECT
ss.id,
ss.memory_session_id,
COALESCE(s.platform_source, 'claude') as platform_source,
ss.request,
ss.investigated,
ss.learned,
ss.completed,
ss.next_steps,
ss.created_at,
ss.created_at_epoch,
ss.project
FROM session_summaries ss
LEFT JOIN sdk_sessions s ON ss.memory_session_id = s.memory_session_id
WHERE (ss.project IN (${s})
OR ss.merged_into_project IN (${s}))
ORDER BY ss.created_at_epoch DESC
LIMIT ?
`).all(...e,...e,t.sessionCount+de)}function Vt(r){return r.replace(/\//g,"-")}function Yt(r){if(!r.includes('"type":"assistant"'))return null;let e=JSON.parse(r);if(e.type==="assistant"&&e.message?.content&&Array.isArray(e.message.content)){let t="";for(let s of e.message.content)s.type==="text"&&(t+=s.text);if(t=t.replace(ke,"").trim(),t)return t}return null}function Kt(r){for(let e=r.length-1;e>=0;e--)try{let t=Yt(r[e]);if(t)return t}catch(t){t instanceof Error?u.debug("WORKER","Skipping malformed transcript line",{lineIndex:e},t):u.debug("WORKER","Skipping malformed transcript line",{lineIndex:e,error:String(t)});continue}return""}function Jt(r){try{if(!(0,K.existsSync)(r))return{userMessage:"",assistantMessage:""};let e=(0,K.readFileSync)(r,"utf-8").trim();if(!e)return{userMessage:"",assistantMessage:""};let t=e.split(`
`).filter(n=>n.trim());return{userMessage:"",assistantMessage:Kt(t)}}catch(e){return e instanceof Error?u.failure("WORKER","Failed to extract prior messages from transcript",{transcriptPath:r},e):u.warn("WORKER","Failed to extract prior messages from transcript",{transcriptPath:r,error:String(e)}),{userMessage:"",assistantMessage:""}}}function le(r,e,t,s){if(!e.showLastMessage||r.length===0)return{userMessage:"",assistantMessage:""};let n=r.find(d=>d.memory_session_id!==t);if(!n)return{userMessage:"",assistantMessage:""};let o=n.memory_session_id,i=Vt(s),a=Fe.default.join(D,"projects",i,`${o}.jsonl`);return Jt(a)}function He(r,e){let t=e[0]?.id;return r.map((s,n)=>{let o=n===0?null:e[n+1];return{...s,displayEpoch:o?o.created_at_epoch:s.created_at_epoch,displayTime:o?o.created_at:s.created_at,shouldShowLink:s.id!==t}})}function pe(r,e){let t=[...r.map(s=>({type:"observation",data:s})),...e.map(s=>({type:"summary",data:s}))];return t.sort((s,n)=>{let o=s.type==="observation"?s.data.created_at_epoch:s.data.displayEpoch,i=n.type==="observation"?n.data.created_at_epoch:n.data.displayEpoch;return o-i}),t}function Ge(r,e){return new Set(r.slice(0,e).map(t=>t.id))}function Xe(){let r=new Date,e=r.toLocaleDateString("en-CA"),t=r.toLocaleTimeString("en-US",{hour:"numeric",minute:"2-digit",hour12:!0}).toLowerCase().replace(" ",""),s=r.toLocaleTimeString("en-US",{timeZoneName:"short"}).split(" ").pop();return`${e} ${t} ${s}`}function je(r){return[`# [${r}] recent context, ${Xe()}`,""]}function Be(){return[`Legend: \u{1F3AF}session ${A.getInstance().getActiveMode().observation_types.map(t=>`${t.emoji}${t.id}`).join(" ")}`,"Format: ID TIME TYPE TITLE","Fetch details: get_observations([IDs]) | Search: mem-search skill",""]}function We(){return[]}function qe(){return[]}function Ve(r,e){let t=[],s=[`${r.totalObservations} obs (${r.totalReadTokens.toLocaleString()}t read)`,`${r.totalDiscoveryTokens.toLocaleString()}t work`];return 
r.totalDiscoveryTokens>0&&(e.showSavingsAmount||e.showSavingsPercent)&&(e.showSavingsPercent?s.push(`${r.savingsPercent}% savings`):e.showSavingsAmount&&s.push(`${r.savings.toLocaleString()}t saved`)),t.push(`Stats: ${s.join(" | ")}`),t.push(""),t}function Ye(r){return[`### ${r}`]}function Ke(r){return r.toLowerCase().replace(" am","a").replace(" pm","p")}function Je(r,e,t){let s=r.title||"Untitled",n=A.getInstance().getTypeIcon(r.type),o=e?Ke(e):'"';return`${r.id} ${o} ${n} ${s}`}function Qe(r,e,t,s){let n=[],o=r.title||"Untitled",i=A.getInstance().getTypeIcon(r.type),a=e?Ke(e):'"',{readTokens:d,discoveryDisplay:c}=F(r,s);n.push(`**${r.id}** ${a} ${i} **${o}**`),t&&n.push(t);let m=[];return s.showReadTokens&&m.push(`~${d}t`),s.showWorkTokens&&m.push(c),m.length>0&&n.push(m.join(" ")),n.push(""),n}function ze(r,e){return[`S${r.id} ${r.request||"Session started"} (${e})`]}function $(r,e){return e?[`**${r}**: ${e}`,""]:[]}function Ze(r){return r.assistantMessage?["","---","","**Previously**","",`A: ${r.assistantMessage}`,""]:[]}function et(r,e){return["",`Access ${Math.round(r/1e3)}k tokens of past work via get_observations([IDs]) or mem-search skill.`]}function tt(r){return`# [${r}] recent context, ${Xe()}




No previous sessions found.`}function st(){let r=new Date,e=r.toLocaleDateString("en-CA"),t=r.toLocaleTimeString("en-US",{hour:"numeric",minute:"2-digit",hour12:!0}).toLowerCase().replace(" ",""),s=r.toLocaleTimeString("en-US",{timeZoneName:"short"}).split(" ").pop();return`${e} ${t} ${s}`}function rt(r){return["",`${_.bright}${_.cyan}[${r}] recent context, ${st()}${_.reset}`,`${_.gray}${"\u2500".repeat(60)}${_.reset}`,""]}function nt(){let e=A.getInstance().getActiveMode().observation_types.map(t=>`${t.emoji} ${t.id}`).join(" | ");return[`${_.dim}Legend: session-request | ${e}${_.reset}`,""]}function ot(){return[`${_.bright}Column Key${_.reset}`,`${_.dim} Read: Tokens to read this observation (cost to learn it now)${_.reset}`,`${_.dim} Work: Tokens spent on work that produced this record (research, building, deciding)${_.reset}`,""]}function it(){return[`${_.dim}Context Index: This semantic index (titles, types, files, tokens) is usually sufficient to understand past work.${_.reset}`,"",`${_.dim}When you need implementation details, rationale, or debugging context:${_.reset}`,`${_.dim} - Fetch by ID: get_observations([IDs]) for observations visible in this index${_.reset}`,`${_.dim} - Search history: Use the mem-search skill for past decisions, bugs, and deeper research${_.reset}`,`${_.dim} - Trust this index over re-reading code for past decisions and learnings${_.reset}`,""]}function at(r,e){let t=[];if(t.push(`${_.bright}${_.cyan}Context Economics${_.reset}`),t.push(`${_.dim} Loading: ${r.totalObservations} observations (${r.totalReadTokens.toLocaleString()} tokens to read)${_.reset}`),t.push(`${_.dim} Work investment: ${r.totalDiscoveryTokens.toLocaleString()} tokens spent on research, building, and decisions${_.reset}`),r.totalDiscoveryTokens>0&&(e.showSavingsAmount||e.showSavingsPercent)){let s=" Your savings: ";e.showSavingsAmount&&e.showSavingsPercent?s+=`${r.savings.toLocaleString()} tokens (${r.savingsPercent}% reduction from reuse)`:e.showSavingsAmount?s+=`${r.savings.toLocaleString()} tokens`:s+=`${r.savingsPercent}% reduction from reuse`,t.push(`${_.green}${s}${_.reset}`)}return t.push(""),t}function dt(r){return[`${_.bright}${_.cyan}${r}${_.reset}`,""]}function _t(r){return[`${_.dim}${r}${_.reset}`]}function ut(r,e,t,s){let n=r.title||"Untitled",o=A.getInstance().getTypeIcon(r.type),{readTokens:i,discoveryTokens:a,workEmoji:d}=F(r,s),c=t?`${_.dim}${e}${_.reset}`:" ".repeat(e.length),m=s.showReadTokens&&i>0?`${_.dim}(~${i}t)${_.reset}`:"",T=s.showWorkTokens&&a>0?`${_.dim}(${d} ${a.toLocaleString()}t)${_.reset}`:"";return` ${_.dim}#${r.id}${_.reset} ${c} ${o} ${n} ${m} ${T}`}function mt(r,e,t,s,n){let o=[],i=r.title||"Untitled",a=A.getInstance().getTypeIcon(r.type),{readTokens:d,discoveryTokens:c,workEmoji:m}=F(r,n),T=t?`${_.dim}${e}${_.reset}`:" ".repeat(e.length),E=n.showReadTokens&&d>0?`${_.dim}(~${d}t)${_.reset}`:"",g=n.showWorkTokens&&c>0?`${_.dim}(${m} ${c.toLocaleString()}t)${_.reset}`:"";return o.push(` ${_.dim}#${r.id}${_.reset} ${T} ${a} ${_.bright}${i}${_.reset}`),s&&o.push(` ${_.dim}${s}${_.reset}`),(E||g)&&o.push(` ${E} ${g}`),o.push(""),o}function ct(r,e){let t=`${r.request||"Session started"} (${e})`;return[`${_.yellow}#S${r.id}${_.reset} ${t}`,""]}function P(r,e,t){return e?[`${t}${r}:${_.reset} ${e}`,""]:[]}function lt(r){return r.assistantMessage?["","---","",`${_.bright}${_.magenta}Previously${_.reset}`,"",`${_.dim}A: ${r.assistantMessage}${_.reset}`,""]:[]}function pt(r,e){let t=Math.round(r/1e3);return["",`${_.dim}Access ${t}k tokens of past research & decisions for just ${e.toLocaleString()}t. Use the claude-mem skill to access memories by ID.${_.reset}`]}function Et(r){return`


${_.bright}${_.cyan}[${r}] recent context, ${st()}${_.reset}


${_.gray}${"\u2500".repeat(60)}${_.reset}




${_.dim}No previous sessions found for this project yet.${_.reset}


`}function gt(r,e,t,s){let n=[];return s?n.push(...rt(r)):n.push(...je(r)),s?n.push(...nt()):n.push(...Be()),s?n.push(...ot()):n.push(...We()),s?n.push(...it()):n.push(...qe()),Y(t)&&(s?n.push(...at(e,t)):n.push(...Ve(e,t))),n}var Ee=M(require("path"),1);function z(r){if(!r)return[];try{let e=JSON.parse(r);return Array.isArray(e)?e:[]}catch(e){return u.debug("PARSER","Failed to parse JSON array, using empty fallback",{preview:r?.substring(0,50)},e instanceof Error?e:new Error(String(e))),[]}}function ge(r){return new Date(r).toLocaleString("en-US",{month:"short",day:"numeric",hour:"numeric",minute:"2-digit",hour12:!0})}function Te(r){return new Date(r).toLocaleString("en-US",{hour:"numeric",minute:"2-digit",hour12:!0})}function ft(r){return new Date(r).toLocaleString("en-US",{month:"short",day:"numeric",year:"numeric"})}function Tt(r,e){return Ee.default.isAbsolute(r)?Ee.default.relative(e,r):r}function St(r,e,t){let s=z(r);if(s.length>0)return Tt(s[0],e);if(t){let n=z(t);if(n.length>0)return Tt(n[0],e)}return"General"}function Qt(r){let e=new Map;for(let s of r){let n=s.type==="observation"?s.data.created_at:s.data.displayTime,o=ft(n);e.has(o)||e.set(o,[]),e.get(o).push(s)}let t=Array.from(e.entries()).sort((s,n)=>{let o=new Date(s[0]).getTime(),i=new Date(n[0]).getTime();return o-i});return new Map(t)}function bt(r,e){return e.fullObservationField==="narrative"?r.narrative:r.facts?z(r.facts).join(`


`):null}function zt(r,e,t,s){let n=[];n.push(...Ye(r));let o="";for(let i of e)if(i.type==="summary"){let a=i.data,d=ge(a.displayTime);n.push(...ze(a,d))}else{let a=i.data,d=Te(a.created_at),m=d!==o?d:"";if(o=d,t.has(a.id)){let E=bt(a,s);n.push(...Qe(a,m,E,s))}else n.push(Je(a,m,s))}return n}function Zt(r,e,t,s,n){let o=[];o.push(...dt(r));let i=null,a="";for(let d of e)if(d.type==="summary"){i=null,a="";let c=d.data,m=ge(c.displayTime);o.push(...ct(c,m))}else{let c=d.data,m=St(c.files_modified,n,c.files_read),T=Te(c.created_at),E=T!==a;a=T;let g=t.has(c.id);if(m!==i&&(o.push(..._t(m)),i=m),g){let f=bt(c,s);o.push(...mt(c,T,E,f,s))}else o.push(ut(c,T,E,s))}return o.push(""),o}function es(r,e,t,s,n,o){return o?Zt(r,e,t,s,n):zt(r,e,t,s)}function ht(r,e,t,s,n){let o=[],i=Qt(r);for(let[a,d]of i)o.push(...es(a,d,e,t,s,n));return o}function Ot(r,e,t){return!(!r.showLastSummary||!e||!!!(e.investigated||e.learned||e.completed||e.next_steps)||t&&e.created_at_epoch<=t.created_at_epoch)}function Rt(r,e){let t=[];return e?(t.push(...P("Investigated",r.investigated,_.blue)),t.push(...P("Learned",r.learned,_.yellow)),t.push(...P("Completed",r.completed,_.green)),t.push(...P("Next Steps",r.next_steps,_.magenta))):(t.push(...$("Investigated",r.investigated)),t.push(...$("Learned",r.learned)),t.push(...$("Completed",r.completed)),t.push(...$("Next Steps",r.next_steps))),t}function At(r,e){return e?lt(r):Ze(r)}function Nt(r,e,t){return!Y(e)||r.totalDiscoveryTokens<=0||r.savings<=0?[]:t?pt(r.totalDiscoveryTokens,r.totalReadTokens):et(r.totalDiscoveryTokens,r.totalReadTokens)}var ts=Ct.default.join((0,It.homedir)(),".claude","plugins","marketplaces","thedotmack","plugin",".install-version");function ss(){try{return new W}catch(r){if(r instanceof Error&&r.code==="ERR_DLOPEN_FAILED"){try{(0,Lt.unlinkSync)(ts)}catch(e){e instanceof Error?u.debug("WORKER","Marker file cleanup failed (may not exist)",{},e):u.debug("WORKER","Marker file cleanup failed (may not exist)",{error:String(e)})}return u.error("WORKER","Native module rebuild needed - restart Claude Code to auto-fix"),null}throw r}}function rs(r,e){return e?Et(r):tt(r)}function ns(r,e,t,s,n,o,i){let a=[],d=ue(e);a.push(...gt(r,d,s,i));let c=t.slice(0,s.sessionCount),m=He(c,t),T=pe(e,m),E=Ge(e,s.fullObservationCount);a.push(...ht(T,E,s,n,i));let g=t[0],f=e[0];Ot(s,g,f)&&a.push(...Rt(g,i));let S=le(e,s,o,n);return a.push(...At(S,i)),a.push(...Nt(d,s,i)),a.join(`


`).trimEnd()}async function fe(r,e=!1){let t=ae(),s=r?.cwd??process.cwd(),n=re(s),o=r?.projects?.length?r.projects:n.allProjects,i=o[o.length-1]??n.primary;r?.full&&(t.totalObservationCount=999999,t.sessionCount=999999);let a=ss();if(!a)return"";try{let d=o.length>1?$e(a,o,t):me(a,i,t),c=o.length>1?Pe(a,o,t):ce(a,i,t);return d.length===0&&c.length===0?rs(i,e):ns(i,d,c,t,s,r?.session_id,e)}finally{a.close()}}0&&(module.exports={generateContext});