Fix 30+ root-cause bugs across 10 triage phases (#1214)

* MAESTRO: fix ChromaDB core issues — Python pinning, Windows paths, disable toggle, metadata sanitization, transport errors

- Add --python version pinning to uvx args in both local and remote modes (fixes #1196, #1206, #1208)
- Convert backslash paths to forward slashes for --data-dir on Windows (fixes #1199)
- Add CLAUDE_MEM_CHROMA_ENABLED setting for SQLite-only fallback mode (fixes #707)
- Sanitize metadata in addDocuments() to filter null/undefined/empty values (fixes #1183, #1188)
- Wrap callTool() in try/catch for transport errors with auto-reconnect (fixes #1162)
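
The metadata sanitization above can be sketched as a pure filter; `sanitizeMetadata` is an illustrative name, not the actual `addDocuments()` internals:

```typescript
// Illustrative sketch of the addDocuments() metadata filter (#1183, #1188):
// ChromaDB rejects null/undefined metadata values, so drop them (and empty
// strings) before handing documents to the client.
type RawMetadata = Record<string, string | number | boolean | null | undefined>;

function sanitizeMetadata(meta: RawMetadata): Record<string, string | number | boolean> {
  const clean: Record<string, string | number | boolean> = {};
  for (const [key, value] of Object.entries(meta)) {
    if (value === null || value === undefined) continue;            // ChromaDB rejects these
    if (typeof value === 'string' && value.trim() === '') continue; // empty values add noise
    clean[key] = value;
  }
  return clean;
}
```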

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* MAESTRO: fix data integrity — content-hash deduplication, project name collision, empty project guard, stuck isProcessing

- Add SHA-256 content-hash deduplication to observations INSERT (store.ts, transactions.ts, SessionStore.ts)
- Add content_hash column via migration 22 with backfill and index
- Fix project name collision: getCurrentProjectName() now returns parent/basename
- Guard against empty project string with cwd-derived fallback
- Fix stuck isProcessing: hasAnyPendingWork() resets processing messages older than 5 minutes
- Add 12 new tests covering all four fixes
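
The dedup idea can be sketched as follows; `insertObservationOnce` and the in-memory set are illustrative stand-ins for the real INSERT statement and the migration-22 content_hash index:

```typescript
// Sketch of the content-hash dedup: hash the observation content with
// SHA-256 and skip the INSERT when an identical hash is already stored.
import { createHash } from 'node:crypto';

function contentHash(content: string): string {
  return createHash('sha256').update(content, 'utf-8').digest('hex');
}

const seen = new Set<string>(); // stands in for a lookup on the content_hash index

function insertObservationOnce(content: string): boolean {
  const hash = contentHash(content);
  if (seen.has(hash)) return false; // duplicate content, skip the INSERT
  seen.add(hash);
  // the real path runs the INSERT here, storing hash in the content_hash column
  return true;
}
```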

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* MAESTRO: fix hook lifecycle — stderr suppression, output isolation, conversation pollution prevention

- Suppress process.stderr.write in hookCommand() to prevent Claude Code from showing diagnostic
  output as error UI (#1181). Restores stderr in the finally block for the worker-continues case.
- Convert console.error() to logger.warn()/error() in hook-command.ts and handlers/index.ts
  so all diagnostics route to log file instead of stderr.
- Verified all 7 handlers return suppressOutput: true (prevents conversation pollution #598, #784).
- Verified session-complete is a recognized event type (fixes #984).
- Verified unknown event types return no-op handler with exit 0 (graceful degradation).
- Added 10 new tests in tests/hook-lifecycle.test.ts covering event dispatch, adapter defaults,
  stderr suppression, and standard response constants.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* MAESTRO: fix worker lifecycle — restart loop coordination, stale transport retry, ENOENT shutdown race

- Add PID file mtime guard to prevent concurrent restart storms (#1145):
  isPidFileRecent() + touchPidFile() coordinate across sessions
- Add transparent retry in ChromaMcpManager.callTool() on transport
  error — reconnects and retries once instead of failing (#1131)
- Wrap getInstalledPluginVersion() with ENOENT/EBUSY handling (#1042)
- Verified ChromaMcpManager.stop() already called on all shutdown paths
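
The coordination decision reduces to a pure check, sketched here; `shouldRestart` is an illustrative reduction of the `isPidFileRecent()` logic:

```typescript
// Sketch of the restart-storm guard decision (#1145): a fresh PID-file
// mtime means another session just restarted the worker, so this session
// should poll /api/health instead of spawning its own restart.
function shouldRestart(pidFileMtimeMs: number | null, nowMs: number, thresholdMs = 15_000): boolean {
  if (pidFileMtimeMs === null) return true;     // no PID file, nothing to defer to
  return nowMs - pidFileMtimeMs >= thresholdMs; // stale mtime, safe to restart
}
```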

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* MAESTRO: fix Windows platform support — uvx.cmd spawn, PowerShell $_ elimination, windowsHide, FTS5 fallback

- Route uvx spawn through cmd.exe /c on Windows since MCP SDK lacks shell:true (#1190, #1192, #1199)
- Replace all PowerShell Where-Object {$_} pipelines with WQL -Filter server-side filtering (#1024, #1062)
- Add windowsHide: true to all exec/spawn calls missing it to prevent console popups (#1048)
- Add FTS5 runtime probe with graceful fallback when unavailable on Windows (#791)
- Guard FTS5 table creation in migrations, SessionSearch, and SessionStore with try/catch
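
The spawn routing can be sketched like this; `buildSpawnArgs` is an illustrative helper, not the actual ChromaMcpManager API:

```typescript
// Sketch of the Windows spawn routing: the MCP SDK's stdio transport has no
// shell: true option, so .cmd shims like uvx.cmd must be launched through
// cmd.exe explicitly.
function buildSpawnArgs(command: string, args: string[], platform: string): { command: string; args: string[] } {
  if (platform === 'win32') {
    // cmd.exe resolves `uvx` to `uvx.cmd` the way a shell would
    return { command: 'cmd.exe', args: ['/c', command, ...args] };
  }
  return { command, args };
}
```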

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* MAESTRO: fix skills/ distribution — build-time verification and regression tests (#1187)

Add post-build verification in build-hooks.js that fails if critical
distribution files (skills, hooks, plugin manifest) are missing. Add
10 regression tests covering skill file presence, YAML frontmatter,
hooks.json integrity, and package.json files field.
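
The verification step amounts to an existence check over a manifest; the file names and dist root here are illustrative, not the actual build-hooks.js list:

```typescript
// Minimal sketch of the post-build distribution check.
import { existsSync } from 'node:fs';
import { join } from 'node:path';

function findMissingDistFiles(distRoot: string, requiredFiles: string[]): string[] {
  return requiredFiles.filter(rel => !existsSync(join(distRoot, rel)));
}

// In the build hook, any missing file fails the build:
// const missing = findMissingDistFiles('dist', ['hooks/hooks.json', 'plugin/plugin.json']);
// if (missing.length > 0) throw new Error(`Distribution incomplete: ${missing.join(', ')}`);
```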

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* MAESTRO: fix MigrationRunner schema initialization (#979) — version conflict between parallel migration systems

Root cause: old DatabaseManager migrations 1-7 shared the schema_versions table with
MigrationRunner's 4-22, causing version-number collisions (5 = drop tables vs. add column,
6 = FTS5 vs. prompt tracking, 7 = discovery_tokens vs. remove UNIQUE). initializeSchema()
was gated behind maxApplied === 0, so core tables were never created when old versions
were present.

Fixes:
- initializeSchema() always creates core tables via CREATE TABLE IF NOT EXISTS
- Migrations 5-7 check actual DB state (columns/constraints) not just version tracking
- Crash-safe temp table rebuilds (DROP IF EXISTS _new before CREATE)
- Added missing migration 21 (ON UPDATE CASCADE) to MigrationRunner
- Added ON UPDATE CASCADE to FK definitions in initializeSchema()
- All changes applied to both runner.ts and SessionStore.ts

Tests: 13 new tests in migration-runner.test.ts covering fresh DB, idempotency,
version conflicts, crash recovery, FK constraints, and data integrity.
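
The state-based guard can be sketched as a pure predicate; `tableInfo` stands in for rows returned by `PRAGMA table_info(observations)`:

```typescript
// Sketch of the state-based migration guard (#979): with two migration
// systems sharing one version counter, the version number alone cannot tell
// whether a contested migration actually ran, so each one inspects the real
// schema first.
interface ColumnInfo { name: string }

function shouldApplyAddColumn(tableInfo: ColumnInfo[], column: string): boolean {
  // Apply only when the column is genuinely absent, regardless of what
  // the shared schema_versions table claims was already applied.
  return !tableInfo.some(c => c.name === column);
}
```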

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* MAESTRO: fix 21 test failures — stale mocks, outdated assertions, missing OpenClaw guards

Server tests (12): Added missing workerPath and getAiStatus to ServerOptions
mocks after interface expansion. ChromaSync tests (3): Updated to verify
transport cleanup in ChromaMcpManager after architecture refactor. OpenClaw (2):
Added memory_ tool skipping and response truncation to prevent recursive loops
and oversized payloads. MarkdownFormatter (2): Updated assertions to match
current output. SettingsDefaultsManager (1): Used correct default key for
getBool test. Logger standards (1): Excluded CLI transcript command from
background service check.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* MAESTRO: fix Codex CLI compatibility (#744) — session_id fallbacks, unknown platform tolerance, undefined guard

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* MAESTRO: fix Cursor IDE integration (#838, #1049) — adapter field fallbacks, tolerant session-init validation

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* MAESTRO: fix /api/logs OOM (#1203) — tail-read replaces full-file readFileSync

Replace readFileSync (loads entire file into memory) with readLastLines()
that reads only from the end of the file in expanding chunks (64KB → 10MB cap).
Prevents OOM on large log files while preserving the same API response shape.
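
The tail-read strategy can be sketched as below; the shipped readLastLines may differ in detail, and a production version would also drop the possibly partial first line of each chunk:

```typescript
// Sketch of the expanding-chunk tail read: read a window from the end of the
// file, double it until enough lines appear or the 10MB cap / start of file
// is reached. The full file is never loaded into memory.
import { openSync, fstatSync, readSync, closeSync, writeFileSync } from 'node:fs';
import { tmpdir } from 'node:os';
import { join } from 'node:path';

function readLastLines(filePath: string, maxLines: number, capBytes = 10 * 1024 * 1024): string[] {
  const fd = openSync(filePath, 'r');
  try {
    const size = fstatSync(fd).size;
    let chunk = 64 * 1024; // start small; most requests need only a few KB
    for (;;) {
      const readBytes = Math.min(chunk, capBytes, size);
      const buf = Buffer.alloc(readBytes);
      readSync(fd, buf, 0, readBytes, size - readBytes); // read from the end
      const lines = buf.toString('utf-8').split('\n').filter(l => l.length > 0);
      const exhausted = readBytes >= size || readBytes >= capBytes;
      if (lines.length >= maxLines || exhausted) return lines.slice(-maxLines);
      chunk *= 2; // not enough lines yet; expand the window and retry
    }
  } finally {
    closeSync(fd);
  }
}

// Demo on a small file (the real input is a potentially huge worker log):
const demoPath = join(tmpdir(), 'claude-mem-tail-demo.log');
writeFileSync(demoPath, 'line1\nline2\nline3\nline4\n');
const tail = readLastLines(demoPath, 2);
```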

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* MAESTRO: fix Settings CORS error (#1029) — explicit methods and allowedHeaders in CORS config
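
The shape of the fix, sketched with illustrative values (the shipped origin/method/header lists may differ):

```typescript
// Sketch of the CORS fix shape: the Settings UI preflight (OPTIONS) fails
// unless methods and allowedHeaders are listed explicitly; a bare
// origin-only config is not enough.
const corsOptions = {
  origin: true, // reflect the request origin
  methods: ['GET', 'POST', 'PUT', 'PATCH', 'DELETE', 'OPTIONS'],
  allowedHeaders: ['Content-Type', 'Authorization'],
};
// app.use(cors(corsOptions));
```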

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* MAESTRO: add session custom_title for agent attribution (#1213) — migration 23, endpoint + store support

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* MAESTRO: prevent CLAUDE.md/AGENTS.md writes inside .git/ directories (#1165)

Add .git path guard to all 4 write sites to prevent ref corruption when
paths resolve inside .git internals.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* MAESTRO: fix plugin disabled state not respected (#781) — early exit check in all hook entry points

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* MAESTRO: fix UserPromptSubmit context re-injection on every turn (#1079) — contextInjected session flag

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* MAESTRO: fix stale AbortController queue stall (#1099) — lastGeneratorActivity tracking + 30s timeout

Three-layer fix:
1. Added lastGeneratorActivity timestamp to ActiveSession, updated by
   processAgentResponse (all agents), getMessageIterator (queue yields),
   and startGeneratorWithProvider (generator launch)
2. Added stale generator detection in ensureGeneratorRunning — if no
   activity for >30s, aborts stale controller, resets state, restarts
3. Added AbortSignal.timeout(30000) in deleteSession to prevent
   indefinite hang when awaiting a stuck generator promise
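
The layer-2 staleness decision reduces to a pure check; `isGeneratorStale` is an illustrative reduction of the test in ensureGeneratorRunning:

```typescript
// Sketch of the staleness check: a generator with no recorded activity for
// more than 30s is treated as stuck, and its AbortController gets aborted.
const STALE_GENERATOR_MS = 30_000;

function isGeneratorStale(lastGeneratorActivity: number, nowMs: number): boolean {
  return nowMs - lastGeneratorActivity > STALE_GENERATOR_MS;
}
```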

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
Author: Alex Newman
Date: 2026-02-23 19:34:35 -05:00
Committed by: GitHub
Parent: d9a30cc7d4
Commit: c6f932988a
62 changed files with 3639 additions and 793 deletions
+1 -1
@@ -6,7 +6,7 @@ export const claudeCodeAdapter: PlatformAdapter = {
   normalizeInput(raw) {
     const r = (raw ?? {}) as any;
     return {
-      sessionId: r.session_id,
+      sessionId: r.session_id ?? r.id ?? r.sessionId,
       cwd: r.cwd ?? process.cwd(),
       prompt: r.prompt,
       toolName: r.tool_name,
+8 -3
@@ -3,15 +3,20 @@ import type { PlatformAdapter, NormalizedHookInput, HookResult } from '../types.
 // Maps Cursor stdin format - field names differ from Claude Code
 // Cursor uses: conversation_id, workspace_roots[], result_json, command/output
 // Handle undefined input gracefully for hooks that don't receive stdin
+//
+// Cursor payload variations (#838, #1049):
+// Session ID: conversation_id, generation_id, or id
+// Prompt: prompt, query, input, or message (varies by Cursor version/hook type)
+// CWD: workspace_roots[0] or cwd
 export const cursorAdapter: PlatformAdapter = {
   normalizeInput(raw) {
     const r = (raw ?? {}) as any;
     // Cursor-specific: shell commands come as command/output instead of tool_name/input/response
     const isShellCommand = !!r.command && !r.tool_name;
     return {
-      sessionId: r.conversation_id || r.generation_id, // conversation_id preferred
-      cwd: r.workspace_roots?.[0] ?? process.cwd(), // First workspace root
-      prompt: r.prompt,
+      sessionId: r.conversation_id || r.generation_id || r.id,
+      cwd: r.workspace_roots?.[0] ?? r.cwd ?? process.cwd(),
+      prompt: r.prompt ?? r.query ?? r.input ?? r.message,
       toolName: isShellCommand ? 'Bash' : r.tool_name,
       toolInput: isShellCommand ? { command: r.command } : r.tool_input,
       toolResponse: isShellCommand ? { output: r.output } : r.result_json, // result_json not tool_response
+2 -1
@@ -8,7 +8,8 @@ export function getPlatformAdapter(platform: string): PlatformAdapter {
     case 'claude-code': return claudeCodeAdapter;
     case 'cursor': return cursorAdapter;
     case 'raw': return rawAdapter;
-    default: throw new Error(`Unknown platform: ${platform}`);
+    // Codex CLI and other compatible platforms use the raw adapter (accepts both camelCase and snake_case fields)
+    default: return rawAdapter;
   }
 }
+5
@@ -264,6 +264,11 @@ function formatObservationsForClaudeMd(observations: ObservationRow[], folderPat
  * Only writes to folders that exist — never creates directories.
  */
 function writeClaudeMdToFolder(folderPath: string, newContent: string): void {
+  const resolvedPath = path.resolve(folderPath);
+  // Never write inside .git directories — corrupts refs (#1165)
+  if (resolvedPath.includes('/.git/') || resolvedPath.includes('\\.git\\') || resolvedPath.endsWith('/.git') || resolvedPath.endsWith('\\.git')) return;
   const claudeMdPath = path.join(folderPath, 'CLAUDE.md');
   const tempFile = `${claudeMdPath}.tmp`;
+2 -1
@@ -6,6 +6,7 @@
 import type { EventHandler } from '../types.js';
 import { HOOK_EXIT_CODES } from '../../shared/hook-constants.js';
+import { logger } from '../../utils/logger.js';
 import { contextHandler } from './context.js';
 import { sessionInitHandler } from './session-init.js';
 import { observationHandler } from './observation.js';
@@ -46,7 +47,7 @@ const handlers: Record<EventType, EventHandler> = {
 export function getEventHandler(eventType: string): EventHandler {
   const handler = handlers[eventType as EventType];
   if (!handler) {
-    console.error(`[claude-mem] Unknown event type: ${eventType}, returning no-op`);
+    logger.warn('HOOK', `Unknown event type: ${eventType}, returning no-op`);
     return {
       async execute() {
         return { continue: true, suppressOutput: true, exitCode: HOOK_EXIT_CODES.SUCCESS };
+18 -1
@@ -24,6 +24,12 @@ export const sessionInitHandler: EventHandler = {
     const { sessionId, cwd, prompt: rawPrompt } = input;
+    // Guard: Codex CLI and other platforms may not provide a session_id (#744)
+    if (!sessionId) {
+      logger.warn('HOOK', 'session-init: No sessionId provided, skipping (Codex CLI or unknown platform)');
+      return { continue: true, suppressOutput: true, exitCode: HOOK_EXIT_CODES.SUCCESS };
+    }
     // Check if project is excluded from tracking
     const settings = SettingsDefaultsManager.loadFromFile(USER_SETTINGS_PATH);
     if (cwd && isProjectExcluded(cwd, settings.CLAUDE_MEM_EXCLUDED_PROJECTS)) {
@@ -63,11 +69,12 @@
       promptNumber: number;
       skipped?: boolean;
       reason?: string;
+      contextInjected?: boolean;
     };
     const sessionDbId = initResult.sessionDbId;
     const promptNumber = initResult.promptNumber;
-    logger.debug('HOOK', 'session-init: Received from /api/sessions/init', { sessionDbId, promptNumber, skipped: initResult.skipped });
+    logger.debug('HOOK', 'session-init: Received from /api/sessions/init', { sessionDbId, promptNumber, skipped: initResult.skipped, contextInjected: initResult.contextInjected });
     // Debug-level alignment log for detailed tracing
     logger.debug('HOOK', `[ALIGNMENT] Hook Entry | contentSessionId=${sessionId} | prompt#=${promptNumber} | sessionDbId=${sessionDbId}`);
@@ -80,6 +87,16 @@
       return { continue: true, suppressOutput: true };
     }
+    // Skip SDK agent re-initialization if context was already injected for this session (#1079)
+    // The prompt was already saved to the database by /api/sessions/init above —
+    // no need to re-start the SDK agent on every turn
+    if (initResult.contextInjected) {
+      logger.info('HOOK', `INIT_COMPLETE | sessionDbId=${sessionDbId} | promptNumber=${promptNumber} | skipped_agent_init=true | reason=context_already_injected`, {
+        sessionId: sessionDbId
+      });
+      return { continue: true, suppressOutput: true };
+    }
     // Only initialize SDK agent for Claude Code (not Cursor)
     // Cursor doesn't use the SDK agent - it only needs session/observation storage
     if (input.platform !== 'cursor' && sessionDbId) {
+14 -3
@@ -2,6 +2,7 @@ import { readJsonFromStdin } from './stdin-reader.js';
 import { getPlatformAdapter } from './adapters/index.js';
 import { getEventHandler } from './handlers/index.js';
 import { HOOK_EXIT_CODES } from '../shared/hook-constants.js';
+import { logger } from '../utils/logger.js';
 export interface HookCommandOptions {
   /** If true, don't call process.exit() - let caller handle process lifecycle */
@@ -65,6 +66,12 @@ export function isWorkerUnavailableError(error: unknown): boolean {
 }
 export async function hookCommand(platform: string, event: string, options: HookCommandOptions = {}): Promise<number> {
+  // Suppress stderr in hook context — Claude Code shows stderr as error UI (#1181)
+  // Exit 1: stderr shown to user. Exit 2: stderr fed to Claude for processing.
+  // All diagnostics go to log file via logger; stderr must stay clean.
+  const originalStderrWrite = process.stderr.write.bind(process.stderr);
+  process.stderr.write = (() => true) as typeof process.stderr.write;
+  try {
   const adapter = getPlatformAdapter(platform);
   const handler = getEventHandler(event);
@@ -84,18 +91,22 @@ export async function hookCommand(platform: string, event: string, options: Hook
   } catch (error) {
     if (isWorkerUnavailableError(error)) {
       // Worker unavailable — degrade gracefully, don't block the user
-      console.error(`[claude-mem] Worker unavailable, skipping hook: ${error instanceof Error ? error.message : error}`);
+      // Log to file instead of stderr (#1181)
+      logger.warn('HOOK', `Worker unavailable, skipping hook: ${error instanceof Error ? error.message : error}`);
       if (!options.skipExit) {
         process.exit(HOOK_EXIT_CODES.SUCCESS); // = 0 (graceful)
       }
       return HOOK_EXIT_CODES.SUCCESS;
     }
-    // Handler/client bug — show as blocking error so developers see it
-    console.error(`Hook error: ${error}`);
+    // Handler/client bug — log to file instead of stderr (#1181)
+    logger.error('HOOK', `Hook error: ${error instanceof Error ? error.message : error}`, {}, error instanceof Error ? error : undefined);
     if (!options.skipExit) {
       process.exit(HOOK_EXIT_CODES.BLOCKING_ERROR); // = 2
     }
     return HOOK_EXIT_CODES.BLOCKING_ERROR;
+  } finally {
+    // Restore stderr for non-hook code paths (e.g., when skipExit is true and process continues as worker)
+    process.stderr.write = originalStderrWrite;
+  }
 }
+16 -6
@@ -115,12 +115,22 @@ export async function httpShutdown(port: number): Promise<boolean> {
 /**
  * Get the plugin version from the installed marketplace package.json
- * This is the "expected" version that should be running
+ * This is the "expected" version that should be running.
+ * Returns 'unknown' on ENOENT/EBUSY (shutdown race condition, fix #1042).
  */
 export function getInstalledPluginVersion(): string {
-  const packageJsonPath = path.join(MARKETPLACE_ROOT, 'package.json');
-  const packageJson = JSON.parse(readFileSync(packageJsonPath, 'utf-8'));
-  return packageJson.version;
+  try {
+    const packageJsonPath = path.join(MARKETPLACE_ROOT, 'package.json');
+    const packageJson = JSON.parse(readFileSync(packageJsonPath, 'utf-8'));
+    return packageJson.version;
+  } catch (error: unknown) {
+    const code = (error as NodeJS.ErrnoException).code;
+    if (code === 'ENOENT' || code === 'EBUSY') {
+      logger.debug('SYSTEM', 'Could not read plugin version (shutdown race)', { code });
+      return 'unknown';
+    }
+    throw error;
+  }
 }

 /**
@@ -155,8 +165,8 @@ export async function checkVersionMatch(port: number): Promise<VersionCheckResul
   const pluginVersion = getInstalledPluginVersion();
   const workerVersion = await getRunningWorkerVersion(port);
-  // If we can't get worker version, assume it matches (graceful degradation)
-  if (!workerVersion) {
+  // If either version is unknown/null, assume match (graceful degradation, fix #1042)
+  if (!workerVersion || pluginVersion === 'unknown') {
     return { matches: true, pluginVersion, workerVersion };
   }
+59 -22
@@ -10,7 +10,7 @@
 import path from 'path';
 import { homedir } from 'os';
-import { existsSync, writeFileSync, readFileSync, unlinkSync, mkdirSync, rmSync } from 'fs';
+import { existsSync, writeFileSync, readFileSync, unlinkSync, mkdirSync, rmSync, statSync, utimesSync } from 'fs';
 import { exec, execSync, spawn } from 'child_process';
 import { promisify } from 'util';
 import { logger } from '../../utils/logger.js';
@@ -54,7 +54,8 @@ function lookupBinaryInPath(binaryName: string, platform: NodeJS.Platform): stri
   try {
     const output = execSync(command, {
       stdio: ['ignore', 'pipe', 'ignore'],
-      encoding: 'utf-8'
+      encoding: 'utf-8',
+      windowsHide: true
     });
     const firstMatch = output
@@ -191,10 +192,10 @@ export async function getChildProcesses(parentPid: number): Promise<number[]> {
   }
   try {
-    // PowerShell Get-Process instead of WMIC (deprecated in Windows 11)
-    const cmd = `powershell -NoProfile -NonInteractive -Command "Get-Process | Where-Object { $_.ParentProcessId -eq ${parentPid} } | Select-Object -ExpandProperty Id"`;
-    const { stdout } = await execAsync(cmd, { timeout: HOOK_TIMEOUTS.POWERSHELL_COMMAND });
-    // PowerShell outputs just numbers (one per line), simpler than WMIC's "ProcessId=1234" format
+    // Use WQL -Filter to avoid $_ pipeline syntax that breaks in Git Bash (#1062, #1024).
+    // Get-CimInstance with server-side filtering is also more efficient than piping through Where-Object.
+    const cmd = `powershell -NoProfile -NonInteractive -Command "Get-CimInstance Win32_Process -Filter 'ParentProcessId=${parentPid}' | Select-Object -ExpandProperty ProcessId"`;
+    const { stdout } = await execAsync(cmd, { timeout: HOOK_TIMEOUTS.POWERSHELL_COMMAND, windowsHide: true });
     return stdout
       .split('\n')
       .map(line => line.trim())
@@ -223,7 +224,7 @@ export async function forceKillProcess(pid: number): Promise<void> {
   try {
     if (process.platform === 'win32') {
       // /T kills entire process tree, /F forces termination
-      await execAsync(`taskkill /PID ${pid} /T /F`, { timeout: HOOK_TIMEOUTS.POWERSHELL_COMMAND });
+      await execAsync(`taskkill /PID ${pid} /T /F`, { timeout: HOOK_TIMEOUTS.POWERSHELL_COMMAND, windowsHide: true });
     } else {
       process.kill(pid, 'SIGKILL');
     }
@@ -315,13 +316,14 @@ export async function cleanupOrphanedProcesses(): Promise<void> {
   try {
     if (isWindows) {
-      // Windows: Use PowerShell Get-CimInstance with JSON output for age filtering
-      const patternConditions = ORPHAN_PROCESS_PATTERNS
-        .map(p => `$_.CommandLine -like '*${p}*'`)
-        .join(' -or ');
-      const cmd = `powershell -NoProfile -NonInteractive -Command "Get-CimInstance Win32_Process | Where-Object { (${patternConditions}) -and $_.ProcessId -ne ${currentPid} } | Select-Object ProcessId, CreationDate | ConvertTo-Json"`;
-      const { stdout } = await execAsync(cmd, { timeout: HOOK_TIMEOUTS.POWERSHELL_COMMAND });
+      // Windows: Use WQL -Filter for server-side filtering (no $_ pipeline syntax).
+      // Avoids Git Bash $_ interpretation (#1062) and PowerShell syntax errors (#1024).
+      const wqlPatternConditions = ORPHAN_PROCESS_PATTERNS
+        .map(p => `CommandLine LIKE '%${p}%'`)
+        .join(' OR ');
+      const cmd = `powershell -NoProfile -NonInteractive -Command "Get-CimInstance Win32_Process -Filter '(${wqlPatternConditions}) AND ProcessId != ${currentPid}' | Select-Object ProcessId, CreationDate | ConvertTo-Json"`;
+      const { stdout } = await execAsync(cmd, { timeout: HOOK_TIMEOUTS.POWERSHELL_COMMAND, windowsHide: true });
       if (!stdout.trim() || stdout.trim() === 'null') {
         logger.debug('SYSTEM', 'No orphaned claude-mem processes found (Windows)');
@@ -406,7 +408,7 @@ export async function cleanupOrphanedProcesses(): Promise<void> {
         continue;
       }
       try {
-        execSync(`taskkill /PID ${pid} /T /F`, { timeout: HOOK_TIMEOUTS.POWERSHELL_COMMAND, stdio: 'ignore' });
+        execSync(`taskkill /PID ${pid} /T /F`, { timeout: HOOK_TIMEOUTS.POWERSHELL_COMMAND, stdio: 'ignore', windowsHide: true });
       } catch (error) {
         // [ANTI-PATTERN IGNORED]: Cleanup loop - process may have exited, continue to next PID
         logger.debug('SYSTEM', 'Failed to kill process, may have already exited', { pid }, error as Error);
@@ -451,12 +453,14 @@ export async function aggressiveStartupCleanup(): Promise<void> {
   try {
     if (isWindows) {
-      const patternConditions = allPatterns
-        .map(p => `$_.CommandLine -like '*${p}*'`)
-        .join(' -or ');
-      const cmd = `powershell -NoProfile -NonInteractive -Command "Get-CimInstance Win32_Process | Where-Object { (${patternConditions}) -and $_.ProcessId -ne ${currentPid} } | Select-Object ProcessId, CommandLine, CreationDate | ConvertTo-Json"`;
-      const { stdout } = await execAsync(cmd, { timeout: HOOK_TIMEOUTS.POWERSHELL_COMMAND });
+      // Use WQL -Filter for server-side filtering (no $_ pipeline syntax).
+      // Avoids Git Bash $_ interpretation (#1062) and PowerShell syntax errors (#1024).
+      const wqlPatternConditions = allPatterns
+        .map(p => `CommandLine LIKE '%${p}%'`)
+        .join(' OR ');
+      const cmd = `powershell -NoProfile -NonInteractive -Command "Get-CimInstance Win32_Process -Filter '(${wqlPatternConditions}) AND ProcessId != ${currentPid}' | Select-Object ProcessId, CommandLine, CreationDate | ConvertTo-Json"`;
+      const { stdout } = await execAsync(cmd, { timeout: HOOK_TIMEOUTS.POWERSHELL_COMMAND, windowsHide: true });
       if (!stdout.trim() || stdout.trim() === 'null') {
         logger.debug('SYSTEM', 'No orphaned claude-mem processes found (Windows)');
@@ -549,7 +553,7 @@ export async function aggressiveStartupCleanup(): Promise<void> {
     for (const pid of pidsToKill) {
       if (!Number.isInteger(pid) || pid <= 0) continue;
       try {
-        execSync(`taskkill /PID ${pid} /T /F`, { timeout: HOOK_TIMEOUTS.POWERSHELL_COMMAND, stdio: 'ignore' });
+        execSync(`taskkill /PID ${pid} /T /F`, { timeout: HOOK_TIMEOUTS.POWERSHELL_COMMAND, stdio: 'ignore', windowsHide: true });
       } catch (error) {
         logger.debug('SYSTEM', 'Failed to kill process, may have already exited', { pid }, error as Error);
       }
@@ -699,10 +703,10 @@ export function spawnDaemon(
  *
  * EPERM is treated as "alive" because it means the process exists but
  * belongs to a different user/session (common in multi-user setups).
- * PID 0 (Windows WMIC sentinel for unknown PID) is treated as alive.
+ * PID 0 (Windows sentinel for unknown PID) is treated as alive.
  */
 export function isProcessAlive(pid: number): boolean {
-  // PID 0 is the Windows WMIC sentinel value — process was spawned but PID unknown
+  // PID 0 is the Windows sentinel value — process was spawned but PID unknown
   if (pid === 0) return true;

   // Invalid PIDs are not alive
@@ -720,6 +724,39 @@ export function isProcessAlive(pid: number): boolean {
   }
 }

+/**
+ * Check if the PID file was written recently (within thresholdMs).
+ *
+ * Used to coordinate restarts across concurrent sessions: if the PID file
+ * was recently written, another session likely just restarted the worker.
+ * Callers should poll /api/health instead of attempting their own restart.
+ *
+ * @param thresholdMs - Maximum age in ms to consider "recent" (default: 15000)
+ * @returns true if the PID file exists and was modified within thresholdMs
+ */
+export function isPidFileRecent(thresholdMs: number = 15000): boolean {
+  try {
+    const stats = statSync(PID_FILE);
+    return (Date.now() - stats.mtimeMs) < thresholdMs;
+  } catch {
+    return false;
+  }
+}
+
+/**
+ * Touch the PID file to update its mtime without changing contents.
+ * Used after a restart to signal other sessions that a restart just completed.
+ */
+export function touchPidFile(): void {
+  try {
+    if (!existsSync(PID_FILE)) return;
+    const now = new Date();
+    utimesSync(PID_FILE, now, now);
+  } catch {
+    // Best-effort — failure to touch doesn't affect correctness
+  }
+}
+
 /**
  * Read the PID file and remove it if the recorded process is dead (stale).
  *
+14 -1
@@ -398,9 +398,22 @@ export class PendingMessageStore {
   }

   /**
-   * Check if any session has pending work
+   * Check if any session has pending work.
+   * Excludes 'processing' messages stuck for >5 minutes (resets them to 'pending' as a side effect).
    */
   hasAnyPendingWork(): boolean {
+    // Reset stuck 'processing' messages older than 5 minutes before checking
+    const stuckCutoff = Date.now() - (5 * 60 * 1000);
+    const resetStmt = this.db.prepare(`
+      UPDATE pending_messages
+      SET status = 'pending', started_processing_at_epoch = NULL
+      WHERE status = 'processing' AND started_processing_at_epoch < ?
+    `);
+    const resetResult = resetStmt.run(stuckCutoff);
+    if (resetResult.changes > 0) {
+      logger.info('QUEUE', `STUCK_RESET | hasAnyPendingWork reset ${resetResult.changes} stuck processing message(s) older than 5 minutes`);
+    }
+
     const stmt = this.db.prepare(`
       SELECT COUNT(*) as count FROM pending_messages
       WHERE status IN ('pending', 'processing')
+102 -72
@@ -46,6 +46,10 @@ export class SessionSearch {
    * - Tables maintained but search paths removed
    * - Triggers still fire to keep tables synchronized
    *
+   * FTS5 may be unavailable on some platforms (e.g., Bun on Windows #791).
+   * When unavailable, we skip FTS table creation — search falls back to
+   * ChromaDB (vector) and LIKE queries (structured filters) which are unaffected.
+   *
    * TODO: Remove FTS5 infrastructure in future major version (v7.0.0)
    */
   private ensureFTSTables(): void {
@@ -58,91 +62,117 @@
       return;
     }

+    // Runtime check: verify FTS5 is available before attempting to create tables.
+    // bun:sqlite on Windows may not include the FTS5 extension (#791).
+    if (!this.isFts5Available()) {
+      logger.warn('DB', 'FTS5 not available on this platform — skipping FTS table creation (search uses ChromaDB)');
+      return;
+    }
+
     logger.info('DB', 'Creating FTS5 tables');

-    // Create observations_fts virtual table
-    this.db.run(`
-      CREATE VIRTUAL TABLE IF NOT EXISTS observations_fts USING fts5(
-        title,
-        subtitle,
-        narrative,
-        text,
-        facts,
-        concepts,
-        content='observations',
-        content_rowid='id'
-      );
-    `);
-
-    // Populate with existing data
-    this.db.run(`
-      INSERT INTO observations_fts(rowid, title, subtitle, narrative, text, facts, concepts)
-      SELECT id, title, subtitle, narrative, text, facts, concepts
-      FROM observations;
-    `);
-
-    // Create triggers for observations
-    this.db.run(`
-      CREATE TRIGGER IF NOT EXISTS observations_ai AFTER INSERT ON observations BEGIN
-        INSERT INTO observations_fts(rowid, title, subtitle, narrative, text, facts, concepts)
-        VALUES (new.id, new.title, new.subtitle, new.narrative, new.text, new.facts, new.concepts);
-      END;
-
-      CREATE TRIGGER IF NOT EXISTS observations_ad AFTER DELETE ON observations BEGIN
-        INSERT INTO observations_fts(observations_fts, rowid, title, subtitle, narrative, text, facts, concepts)
-        VALUES('delete', old.id, old.title, old.subtitle, old.narrative, old.text, old.facts, old.concepts);
-      END;
-
-      CREATE TRIGGER IF NOT EXISTS observations_au AFTER UPDATE ON observations BEGIN
-        INSERT INTO observations_fts(observations_fts, rowid, title, subtitle, narrative, text, facts, concepts)
-        VALUES('delete', old.id, old.title, old.subtitle, old.narrative, old.text, old.facts, old.concepts);
-        INSERT INTO observations_fts(rowid, title, subtitle, narrative, text, facts, concepts)
-        VALUES (new.id, new.title, new.subtitle, new.narrative, new.text, new.facts, new.concepts);
-      END;
-    `);
-
-    // Create session_summaries_fts virtual table
-    this.db.run(`
-      CREATE VIRTUAL TABLE IF NOT EXISTS session_summaries_fts USING fts5(
-        request,
-        investigated,
-        learned,
-        completed,
-        next_steps,
-        notes,
-        content='session_summaries',
-        content_rowid='id'
-      );
-    `);
-
-    // Populate with existing data
-    this.db.run(`
-      INSERT INTO session_summaries_fts(rowid, request, investigated, learned, completed, next_steps, notes)
-      SELECT id, request, investigated, learned, completed, next_steps, notes
-      FROM session_summaries;
-    `);
-
-    // Create triggers for session_summaries
-    this.db.run(`
-      CREATE TRIGGER IF NOT EXISTS session_summaries_ai AFTER INSERT ON session_summaries BEGIN
-        INSERT INTO session_summaries_fts(rowid, request, investigated, learned, completed, next_steps, notes)
-        VALUES (new.id, new.request, new.investigated, new.learned, new.completed, new.next_steps, new.notes);
-      END;
-
-      CREATE TRIGGER IF NOT EXISTS session_summaries_ad AFTER DELETE ON session_summaries BEGIN
-        INSERT INTO session_summaries_fts(session_summaries_fts, rowid, request, investigated, learned, completed, next_steps, notes)
-        VALUES('delete', old.id, old.request, old.investigated, old.learned, old.completed, old.next_steps, old.notes);
-      END;
-
-      CREATE TRIGGER IF NOT EXISTS session_summaries_au AFTER UPDATE ON session_summaries BEGIN
-        INSERT INTO session_summaries_fts(session_summaries_fts, rowid, request, investigated, learned, completed, next_steps, notes)
-        VALUES('delete', old.id, old.request, old.investigated, old.learned, old.completed, old.next_steps, old.notes);
-        INSERT INTO session_summaries_fts(rowid, request, investigated, learned, completed, next_steps, notes)
-        VALUES (new.id, new.request, new.investigated, new.learned, new.completed, new.next_steps, new.notes);
-      END;
-    `);
-
-    logger.info('DB', 'FTS5 tables created successfully');
+    try {
+      // Create observations_fts virtual table
+      this.db.run(`
+        CREATE VIRTUAL TABLE IF NOT EXISTS observations_fts USING fts5(
+          title,
+          subtitle,
+          narrative,
+          text,
+          facts,
+          concepts,
+          content='observations',
+          content_rowid='id'
+        );
+      `);
+
+      // Populate with existing data
+      this.db.run(`
+        INSERT INTO observations_fts(rowid, title, subtitle, narrative, text, facts, concepts)
+        SELECT id, title, subtitle, narrative, text, facts, concepts
+        FROM observations;
+      `);
+
+      // Create triggers for observations
+      this.db.run(`
+        CREATE TRIGGER IF NOT EXISTS observations_ai AFTER INSERT ON observations BEGIN
+          INSERT INTO observations_fts(rowid, title, subtitle, narrative, text, facts, concepts)
+          VALUES (new.id, new.title, new.subtitle, new.narrative, new.text, new.facts, new.concepts);
+        END;
+
+        CREATE TRIGGER IF NOT EXISTS observations_ad AFTER DELETE ON observations BEGIN
+          INSERT INTO observations_fts(observations_fts, rowid, title, subtitle, narrative, text, facts, concepts)
+          VALUES('delete', old.id, old.title, old.subtitle, old.narrative, old.text, old.facts, old.concepts);
+        END;
+
+        CREATE TRIGGER IF NOT EXISTS observations_au AFTER UPDATE ON observations BEGIN
+          INSERT INTO observations_fts(observations_fts, rowid, title, subtitle, narrative, text, facts, concepts)
+          VALUES('delete', old.id, old.title, old.subtitle, old.narrative, old.text, old.facts, old.concepts);
+          INSERT INTO observations_fts(rowid, title, subtitle, narrative, text, facts, concepts)
+          VALUES (new.id, new.title, new.subtitle, new.narrative, new.text, new.facts, new.concepts);
+        END;
+      `);
+
+      // Create session_summaries_fts virtual table
+      this.db.run(`
+        CREATE VIRTUAL TABLE IF NOT EXISTS session_summaries_fts USING fts5(
+          request,
+          investigated,
+          learned,
+          completed,
+          next_steps,
+          notes,
+          content='session_summaries',
+          content_rowid='id'
+        );
+      `);
+
+      // Populate with existing data
+      this.db.run(`
+        INSERT INTO session_summaries_fts(rowid, request, investigated, learned, completed, next_steps, notes)
+        SELECT id, request, investigated, learned, completed, next_steps, notes
+        FROM session_summaries;
+      `);
+
+      // Create triggers for session_summaries
+      this.db.run(`
+        CREATE TRIGGER IF NOT EXISTS session_summaries_ai AFTER INSERT ON session_summaries BEGIN
+          INSERT INTO session_summaries_fts(rowid, request, investigated, learned, completed, next_steps, notes)
+          VALUES (new.id, new.request, new.investigated, new.learned, new.completed, new.next_steps, new.notes);
+        END;
+
+        CREATE TRIGGER IF NOT EXISTS session_summaries_ad AFTER DELETE ON session_summaries BEGIN
+          INSERT INTO session_summaries_fts(session_summaries_fts, rowid, request, investigated, learned, completed, next_steps, notes)
+          VALUES('delete', old.id, old.request, old.investigated, old.learned, old.completed, old.next_steps, old.notes);
+        END;
+
+        CREATE TRIGGER IF NOT EXISTS session_summaries_au AFTER UPDATE ON session_summaries BEGIN
+          INSERT INTO session_summaries_fts(session_summaries_fts, rowid, request, investigated, learned, completed, next_steps, notes)
+          VALUES('delete', old.id, old.request, old.investigated, old.learned, old.completed, old.next_steps, old.notes);
+          INSERT INTO session_summaries_fts(rowid, request, investigated, learned, completed, next_steps, notes)
+          VALUES (new.id, new.request, new.investigated, new.learned, new.completed, new.next_steps, new.notes);
+        END;
+      `);
+
+      logger.info('DB', 'FTS5 tables created successfully');
+    } catch (error) {
+      // FTS5 creation failed at runtime despite probe succeeding — degrade gracefully
+      logger.warn('DB', 'FTS5 table creation failed — search will use ChromaDB and LIKE queries', {}, error as Error);
+    }
   }
+
+  /**
+   * Probe whether the FTS5 extension is available in the current SQLite build.
+   * Creates and immediately drops a temporary FTS5 table.
+   */
+  private isFts5Available(): boolean {
+    try {
+      this.db.run('CREATE VIRTUAL TABLE _fts5_probe USING fts5(test_column)');
+      this.db.run('DROP TABLE _fts5_probe');
+      return true;
+    } catch {
+      return false;
+    }
+  }
+183 -117
@@ -13,6 +13,7 @@ import {
LatestPromptResult
} from '../../types/database.js';
import type { PendingMessageStore } from './PendingMessageStore.js';
import { computeObservationContentHash, findDuplicateObservation } from './observations/store.js';
/**
* Session data store for SDK sessions, observations, and summaries
@@ -48,11 +49,17 @@ export class SessionStore {
this.repairSessionIdColumnRename();
this.addFailedAtEpochColumn();
this.addOnUpdateCascadeToForeignKeys();
this.addObservationContentHashColumn();
this.addSessionCustomTitleColumn();
}
/**
* Initialize database schema using migrations (migration004)
* This runs the core SDK tables migration if no tables exist
* Initialize database schema (migration004)
*
* ALWAYS creates core tables using CREATE TABLE IF NOT EXISTS — safe to run
* regardless of schema_versions state. This fixes issue #979 where the old
* DatabaseManager migration system (versions 1-7) shared the schema_versions
* table, causing maxApplied > 0 and skipping core table creation entirely.
*/
private initializeSchema(): void {
// Create schema_versions table if it doesn't exist
@@ -64,90 +71,77 @@ export class SessionStore {
)
`);
// Always create core tables — IF NOT EXISTS makes this idempotent
this.db.run(`
CREATE TABLE IF NOT EXISTS sdk_sessions (
id INTEGER PRIMARY KEY AUTOINCREMENT,
content_session_id TEXT UNIQUE NOT NULL,
memory_session_id TEXT UNIQUE,
project TEXT NOT NULL,
user_prompt TEXT,
started_at TEXT NOT NULL,
started_at_epoch INTEGER NOT NULL,
completed_at TEXT,
completed_at_epoch INTEGER,
status TEXT CHECK(status IN ('active', 'completed', 'failed')) NOT NULL DEFAULT 'active'
);
CREATE INDEX IF NOT EXISTS idx_sdk_sessions_claude_id ON sdk_sessions(content_session_id);
CREATE INDEX IF NOT EXISTS idx_sdk_sessions_sdk_id ON sdk_sessions(memory_session_id);
CREATE INDEX IF NOT EXISTS idx_sdk_sessions_project ON sdk_sessions(project);
CREATE INDEX IF NOT EXISTS idx_sdk_sessions_status ON sdk_sessions(status);
CREATE INDEX IF NOT EXISTS idx_sdk_sessions_started ON sdk_sessions(started_at_epoch DESC);
CREATE TABLE IF NOT EXISTS observations (
id INTEGER PRIMARY KEY AUTOINCREMENT,
memory_session_id TEXT NOT NULL,
project TEXT NOT NULL,
text TEXT NOT NULL,
type TEXT NOT NULL,
created_at TEXT NOT NULL,
created_at_epoch INTEGER NOT NULL,
FOREIGN KEY(memory_session_id) REFERENCES sdk_sessions(memory_session_id) ON DELETE CASCADE ON UPDATE CASCADE
);
CREATE INDEX IF NOT EXISTS idx_observations_sdk_session ON observations(memory_session_id);
CREATE INDEX IF NOT EXISTS idx_observations_project ON observations(project);
CREATE INDEX IF NOT EXISTS idx_observations_type ON observations(type);
CREATE INDEX IF NOT EXISTS idx_observations_created ON observations(created_at_epoch DESC);
CREATE TABLE IF NOT EXISTS session_summaries (
id INTEGER PRIMARY KEY AUTOINCREMENT,
memory_session_id TEXT UNIQUE NOT NULL,
project TEXT NOT NULL,
request TEXT,
investigated TEXT,
learned TEXT,
completed TEXT,
next_steps TEXT,
files_read TEXT,
files_edited TEXT,
notes TEXT,
created_at TEXT NOT NULL,
created_at_epoch INTEGER NOT NULL,
FOREIGN KEY(memory_session_id) REFERENCES sdk_sessions(memory_session_id) ON DELETE CASCADE ON UPDATE CASCADE
);
CREATE INDEX IF NOT EXISTS idx_session_summaries_sdk_session ON session_summaries(memory_session_id);
CREATE INDEX IF NOT EXISTS idx_session_summaries_project ON session_summaries(project);
CREATE INDEX IF NOT EXISTS idx_session_summaries_created ON session_summaries(created_at_epoch DESC);
`);
// Record migration004 as applied (OR IGNORE handles re-runs safely)
this.db.prepare('INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)').run(4, new Date().toISOString());
}
/**
* Ensure worker_port column exists (migration 5)
*
* NOTE: Version 5 conflicts with old DatabaseManager migration005 (which drops orphaned tables).
* We check actual column state rather than relying solely on version tracking.
*/
private ensureWorkerPortColumn(): void {
// Check if migration already applied
const applied = this.db.prepare('SELECT version FROM schema_versions WHERE version = ?').get(5) as SchemaVersion | undefined;
if (applied) return;
// Check if column exists
// Check actual column existence — don't rely on version tracking alone (issue #979)
const tableInfo = this.db.query('PRAGMA table_info(sdk_sessions)').all() as TableColumnInfo[];
const hasWorkerPort = tableInfo.some(col => col.name === 'worker_port');
@@ -162,12 +156,12 @@ export class SessionStore {
/**
* Ensure prompt tracking columns exist (migration 6)
*
* NOTE: Version 6 conflicts with old DatabaseManager migration006 (which creates FTS5 tables).
* We check actual column state rather than relying solely on version tracking.
*/
private ensurePromptTrackingColumns(): void {
// Check if migration already applied
const applied = this.db.prepare('SELECT version FROM schema_versions WHERE version = ?').get(6) as SchemaVersion | undefined;
if (applied) return;
// Check actual column existence — don't rely on version tracking alone (issue #979)
// Check sdk_sessions for prompt_counter
const sessionsInfo = this.db.query('PRAGMA table_info(sdk_sessions)').all() as TableColumnInfo[];
const hasPromptCounter = sessionsInfo.some(col => col.name === 'prompt_counter');
@@ -201,13 +195,12 @@ export class SessionStore {
/**
* Remove UNIQUE constraint from session_summaries.memory_session_id (migration 7)
*
* NOTE: Version 7 conflicts with old DatabaseManager migration007 (which adds discovery_tokens).
* We check actual constraint state rather than relying solely on version tracking.
*/
private removeSessionSummariesUniqueConstraint(): void {
// Check if migration already applied
const applied = this.db.prepare('SELECT version FROM schema_versions WHERE version = ?').get(7) as SchemaVersion | undefined;
if (applied) return;
// Check if UNIQUE constraint exists
// Check actual constraint state — don't rely on version tracking alone (issue #979)
const summariesIndexes = this.db.query('PRAGMA index_list(session_summaries)').all() as IndexInfo[];
const hasUniqueConstraint = summariesIndexes.some(idx => idx.unique === 1);
@@ -222,6 +215,9 @@ export class SessionStore {
// Begin transaction
this.db.run('BEGIN TRANSACTION');
// Clean up leftover temp table from a previously-crashed run
this.db.run('DROP TABLE IF EXISTS session_summaries_new');
// Create new table without UNIQUE constraint
this.db.run(`
CREATE TABLE session_summaries_new (
@@ -335,6 +331,9 @@ export class SessionStore {
// Begin transaction
this.db.run('BEGIN TRANSACTION');
// Clean up leftover temp table from a previously-crashed run
this.db.run('DROP TABLE IF EXISTS observations_new');
// Create new table with text as nullable
this.db.run(`
CREATE TABLE observations_new (
@@ -428,34 +427,39 @@ export class SessionStore {
CREATE INDEX idx_user_prompts_lookup ON user_prompts(content_session_id, prompt_number);
`);
// Create FTS5 virtual table — skip if FTS5 is unavailable (e.g., Bun on Windows #791).
// The user_prompts table itself is still created; only FTS indexing is skipped.
try {
this.db.run(`
CREATE VIRTUAL TABLE user_prompts_fts USING fts5(
prompt_text,
content='user_prompts',
content_rowid='id'
);
`);
// Create triggers to sync FTS5
this.db.run(`
CREATE TRIGGER user_prompts_ai AFTER INSERT ON user_prompts BEGIN
INSERT INTO user_prompts_fts(rowid, prompt_text)
VALUES (new.id, new.prompt_text);
END;
CREATE TRIGGER user_prompts_ad AFTER DELETE ON user_prompts BEGIN
INSERT INTO user_prompts_fts(user_prompts_fts, rowid, prompt_text)
VALUES('delete', old.id, old.prompt_text);
END;
CREATE TRIGGER user_prompts_au AFTER UPDATE ON user_prompts BEGIN
INSERT INTO user_prompts_fts(user_prompts_fts, rowid, prompt_text)
VALUES('delete', old.id, old.prompt_text);
INSERT INTO user_prompts_fts(rowid, prompt_text)
VALUES (new.id, new.prompt_text);
END;
`);
} catch (ftsError) {
logger.warn('DB', 'FTS5 not available — user_prompts_fts skipped (search uses ChromaDB)', {}, ftsError as Error);
}
// Commit transaction
this.db.run('COMMIT');
@@ -463,7 +467,7 @@ export class SessionStore {
// Record migration
this.db.prepare('INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)').run(10, new Date().toISOString());
logger.debug('DB', 'Successfully created user_prompts table with FTS5 support');
logger.debug('DB', 'Successfully created user_prompts table');
}
/**
@@ -675,6 +679,9 @@ export class SessionStore {
this.db.run('DROP TRIGGER IF EXISTS observations_ad');
this.db.run('DROP TRIGGER IF EXISTS observations_au');
// Clean up leftover temp table from a previously-crashed run
this.db.run('DROP TABLE IF EXISTS observations_new');
this.db.run(`
CREATE TABLE observations_new (
id INTEGER PRIMARY KEY AUTOINCREMENT,
@@ -744,6 +751,9 @@ export class SessionStore {
// 2. Recreate session_summaries table
// ==========================================
// Clean up leftover temp table from a previously-crashed run
this.db.run('DROP TABLE IF EXISTS session_summaries_new');
this.db.run(`
CREATE TABLE session_summaries_new (
id INTEGER PRIMARY KEY AUTOINCREMENT,
@@ -825,6 +835,44 @@ export class SessionStore {
}
}
/**
* Add content_hash column to observations for deduplication (migration 22)
*/
private addObservationContentHashColumn(): void {
const applied = this.db.prepare('SELECT version FROM schema_versions WHERE version = ?').get(22) as SchemaVersion | undefined;
if (applied) return;
const tableInfo = this.db.query('PRAGMA table_info(observations)').all() as TableColumnInfo[];
const hasColumn = tableInfo.some(col => col.name === 'content_hash');
if (!hasColumn) {
this.db.run('ALTER TABLE observations ADD COLUMN content_hash TEXT');
this.db.run("UPDATE observations SET content_hash = substr(hex(randomblob(8)), 1, 16) WHERE content_hash IS NULL");
this.db.run('CREATE INDEX IF NOT EXISTS idx_observations_content_hash ON observations(content_hash, created_at_epoch)');
logger.debug('DB', 'Added content_hash column to observations table with backfill and index');
}
this.db.prepare('INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)').run(22, new Date().toISOString());
}
/**
* Add custom_title column to sdk_sessions for agent attribution (migration 23)
*/
private addSessionCustomTitleColumn(): void {
const applied = this.db.prepare('SELECT version FROM schema_versions WHERE version = ?').get(23) as SchemaVersion | undefined;
if (applied) return;
const tableInfo = this.db.query('PRAGMA table_info(sdk_sessions)').all() as TableColumnInfo[];
const hasColumn = tableInfo.some(col => col.name === 'custom_title');
if (!hasColumn) {
this.db.run('ALTER TABLE sdk_sessions ADD COLUMN custom_title TEXT');
logger.debug('DB', 'Added custom_title column to sdk_sessions table');
}
this.db.prepare('INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)').run(23, new Date().toISOString());
}
/**
* Update the memory session ID for a session
* Called by SDKAgent when it captures the session ID from the first SDK message
@@ -1290,9 +1338,10 @@ export class SessionStore {
memory_session_id: string | null;
project: string;
user_prompt: string;
custom_title: string | null;
} | null {
const stmt = this.db.prepare(`
SELECT id, content_session_id, memory_session_id, project, user_prompt
SELECT id, content_session_id, memory_session_id, project, user_prompt, custom_title
FROM sdk_sessions
WHERE id = ?
LIMIT 1
@@ -1311,6 +1360,7 @@ export class SessionStore {
memory_session_id: string;
project: string;
user_prompt: string;
custom_title: string | null;
started_at: string;
started_at_epoch: number;
completed_at: string | null;
@@ -1321,7 +1371,7 @@ export class SessionStore {
const placeholders = memorySessionIds.map(() => '?').join(',');
const stmt = this.db.prepare(`
SELECT id, content_session_id, memory_session_id, project, user_prompt,
SELECT id, content_session_id, memory_session_id, project, user_prompt, custom_title,
started_at, started_at_epoch, completed_at, completed_at_epoch, status
FROM sdk_sessions
WHERE memory_session_id IN (${placeholders})
@@ -1366,7 +1416,7 @@ export class SessionStore {
* Pure get-or-create: never modifies memory_session_id.
* Multi-terminal isolation is handled by ON UPDATE CASCADE at the schema level.
*/
createSDKSession(contentSessionId: string, project: string, userPrompt: string): number {
createSDKSession(contentSessionId: string, project: string, userPrompt: string, customTitle?: string): number {
const now = new Date();
const nowEpoch = now.getTime();
@@ -1383,6 +1433,13 @@ export class SessionStore {
WHERE content_session_id = ? AND (project IS NULL OR project = '')
`).run(project, contentSessionId);
}
// Backfill custom_title if provided and not yet set
if (customTitle) {
this.db.prepare(`
UPDATE sdk_sessions SET custom_title = ?
WHERE content_session_id = ? AND custom_title IS NULL
`).run(customTitle, contentSessionId);
}
return existing.id;
}
@@ -1392,9 +1449,9 @@ export class SessionStore {
// must NEVER equal contentSessionId - that would inject memory messages into the user's transcript!
this.db.prepare(`
INSERT INTO sdk_sessions
(content_session_id, memory_session_id, project, user_prompt, started_at, started_at_epoch, status)
VALUES (?, NULL, ?, ?, ?, ?, 'active')
`).run(contentSessionId, project, userPrompt, now.toISOString(), nowEpoch);
(content_session_id, memory_session_id, project, user_prompt, custom_title, started_at, started_at_epoch, status)
VALUES (?, NULL, ?, ?, ?, ?, ?, 'active')
`).run(contentSessionId, project, userPrompt, customTitle || null, now.toISOString(), nowEpoch);
// Return new ID
const row = this.db.prepare('SELECT id FROM sdk_sessions WHERE content_session_id = ?')
@@ -1441,6 +1498,7 @@ export class SessionStore {
/**
* Store an observation (from SDK parsing)
* Assumes session already exists (created by hook)
* Performs content-hash deduplication: skips INSERT if an identical observation exists within 30s
*/
storeObservation(
memorySessionId: string,
@@ -1463,11 +1521,18 @@ export class SessionStore {
const timestampEpoch = overrideTimestampEpoch ?? Date.now();
const timestampIso = new Date(timestampEpoch).toISOString();
// Content-hash deduplication
const contentHash = computeObservationContentHash(memorySessionId, observation.title, observation.narrative);
const existing = findDuplicateObservation(this.db, contentHash, timestampEpoch);
if (existing) {
return { id: existing.id, createdAtEpoch: existing.created_at_epoch };
}
const stmt = this.db.prepare(`
INSERT INTO observations
(memory_session_id, project, type, title, subtitle, facts, narrative, concepts,
files_read, files_modified, prompt_number, discovery_tokens, created_at, created_at_epoch)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
files_read, files_modified, prompt_number, discovery_tokens, content_hash, created_at, created_at_epoch)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
`);
const result = stmt.run(
@@ -1483,6 +1548,7 @@ export class SessionStore {
JSON.stringify(observation.files_modified),
promptNumber || null,
discoveryTokens,
contentHash,
timestampIso,
timestampEpoch
);
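The dedup path above leans on two helpers imported from observations/store.ts. A minimal sketch of plausible implementations, assuming SHA-256 over the identifying fields and the 30-second duplicate window described in the docstring — the names match the import, but the bodies are assumptions, and an in-memory array stands in for the SQL lookup against idx_observations_content_hash:

```typescript
import { createHash } from 'node:crypto';

// Assumed shape of computeObservationContentHash: SHA-256 over the
// session id, title, and narrative. NUL separators prevent field-boundary
// collisions ("ab"+"c" vs "a"+"bc").
function computeObservationContentHash(
  memorySessionId: string,
  title: string | null | undefined,
  narrative: string | null | undefined
): string {
  return createHash('sha256')
    .update(`${memorySessionId}\u0000${title ?? ''}\u0000${narrative ?? ''}`)
    .digest('hex');
}

interface ObservationRow {
  id: number;
  content_hash: string;
  created_at_epoch: number;
}

// Assumed window logic of findDuplicateObservation: a row counts as a
// duplicate only if its hash matches AND it was inserted within windowMs.
function findDuplicateObservation(
  rows: ObservationRow[],
  contentHash: string,
  nowEpoch: number,
  windowMs = 30_000
): ObservationRow | undefined {
  return rows.find(
    r => r.content_hash === contentHash && nowEpoch - r.created_at_epoch <= windowMs
  );
}
```

The time window matters: without it, a legitimately repeated observation (same title and narrative in a later session phase) would be swallowed forever rather than only during retry storms.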
+10
@@ -372,6 +372,16 @@ export const migration005: Migration = {
export const migration006: Migration = {
version: 6,
up: (db: Database) => {
// FTS5 may be unavailable on some platforms (e.g., Bun on Windows #791).
// Probe before creating tables — search falls back to ChromaDB when unavailable.
try {
db.run('CREATE VIRTUAL TABLE _fts5_probe USING fts5(test_column)');
db.run('DROP TABLE _fts5_probe');
} catch {
console.log('⚠️ FTS5 not available on this platform — skipping FTS migration (search uses ChromaDB)');
return;
}
// FTS5 virtual table for observations
// Note: This assumes the hierarchical fields (title, subtitle, etc.) already exist
// from the inline migrations in SessionStore constructor
+340 -109
@@ -31,11 +31,18 @@ export class MigrationRunner {
this.renameSessionIdColumns();
this.repairSessionIdColumnRename();
this.addFailedAtEpochColumn();
this.addOnUpdateCascadeToForeignKeys();
this.addObservationContentHashColumn();
this.addSessionCustomTitleColumn();
}
/**
* Initialize database schema using migrations (migration004)
* This runs the core SDK tables migration if no tables exist
* Initialize database schema (migration004)
*
* ALWAYS creates core tables using CREATE TABLE IF NOT EXISTS — safe to run
* regardless of schema_versions state. This fixes issue #979 where the old
* DatabaseManager migration system (versions 1-7) shared the schema_versions
* table, causing maxApplied > 0 and skipping core table creation entirely.
*/
private initializeSchema(): void {
// Create schema_versions table if it doesn't exist
@@ -47,90 +54,77 @@ export class MigrationRunner {
)
`);
// Always create core tables — IF NOT EXISTS makes this idempotent
this.db.run(`
CREATE TABLE IF NOT EXISTS sdk_sessions (
id INTEGER PRIMARY KEY AUTOINCREMENT,
content_session_id TEXT UNIQUE NOT NULL,
memory_session_id TEXT UNIQUE,
project TEXT NOT NULL,
user_prompt TEXT,
started_at TEXT NOT NULL,
started_at_epoch INTEGER NOT NULL,
completed_at TEXT,
completed_at_epoch INTEGER,
status TEXT CHECK(status IN ('active', 'completed', 'failed')) NOT NULL DEFAULT 'active'
);
CREATE INDEX IF NOT EXISTS idx_sdk_sessions_claude_id ON sdk_sessions(content_session_id);
CREATE INDEX IF NOT EXISTS idx_sdk_sessions_sdk_id ON sdk_sessions(memory_session_id);
CREATE INDEX IF NOT EXISTS idx_sdk_sessions_project ON sdk_sessions(project);
CREATE INDEX IF NOT EXISTS idx_sdk_sessions_status ON sdk_sessions(status);
CREATE INDEX IF NOT EXISTS idx_sdk_sessions_started ON sdk_sessions(started_at_epoch DESC);
CREATE TABLE IF NOT EXISTS observations (
id INTEGER PRIMARY KEY AUTOINCREMENT,
memory_session_id TEXT NOT NULL,
project TEXT NOT NULL,
text TEXT NOT NULL,
type TEXT NOT NULL,
created_at TEXT NOT NULL,
created_at_epoch INTEGER NOT NULL,
FOREIGN KEY(memory_session_id) REFERENCES sdk_sessions(memory_session_id) ON DELETE CASCADE ON UPDATE CASCADE
);
CREATE INDEX IF NOT EXISTS idx_observations_sdk_session ON observations(memory_session_id);
CREATE INDEX IF NOT EXISTS idx_observations_project ON observations(project);
CREATE INDEX IF NOT EXISTS idx_observations_type ON observations(type);
CREATE INDEX IF NOT EXISTS idx_observations_created ON observations(created_at_epoch DESC);
CREATE TABLE IF NOT EXISTS session_summaries (
id INTEGER PRIMARY KEY AUTOINCREMENT,
memory_session_id TEXT UNIQUE NOT NULL,
project TEXT NOT NULL,
request TEXT,
investigated TEXT,
learned TEXT,
completed TEXT,
next_steps TEXT,
files_read TEXT,
files_edited TEXT,
notes TEXT,
created_at TEXT NOT NULL,
created_at_epoch INTEGER NOT NULL,
FOREIGN KEY(memory_session_id) REFERENCES sdk_sessions(memory_session_id) ON DELETE CASCADE ON UPDATE CASCADE
);
CREATE INDEX IF NOT EXISTS idx_session_summaries_sdk_session ON session_summaries(memory_session_id);
CREATE INDEX IF NOT EXISTS idx_session_summaries_project ON session_summaries(project);
CREATE INDEX IF NOT EXISTS idx_session_summaries_created ON session_summaries(created_at_epoch DESC);
`);
// Record migration004 as applied (OR IGNORE handles re-runs safely)
this.db.prepare('INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)').run(4, new Date().toISOString());
}
/**
* Ensure worker_port column exists (migration 5)
*
* NOTE: Version 5 conflicts with old DatabaseManager migration005 (which drops orphaned tables).
* We check actual column state rather than relying solely on version tracking.
*/
private ensureWorkerPortColumn(): void {
// Check if migration already applied
const applied = this.db.prepare('SELECT version FROM schema_versions WHERE version = ?').get(5) as SchemaVersion | undefined;
if (applied) return;
// Check if column exists
// Check actual column existence — don't rely on version tracking alone (issue #979)
const tableInfo = this.db.query('PRAGMA table_info(sdk_sessions)').all() as TableColumnInfo[];
const hasWorkerPort = tableInfo.some(col => col.name === 'worker_port');
@@ -145,12 +139,12 @@ export class MigrationRunner {
/**
* Ensure prompt tracking columns exist (migration 6)
*
* NOTE: Version 6 conflicts with old DatabaseManager migration006 (which creates FTS5 tables).
* We check actual column state rather than relying solely on version tracking.
*/
private ensurePromptTrackingColumns(): void {
// Check if migration already applied
const applied = this.db.prepare('SELECT version FROM schema_versions WHERE version = ?').get(6) as SchemaVersion | undefined;
if (applied) return;
// Check actual column existence — don't rely on version tracking alone (issue #979)
// Check sdk_sessions for prompt_counter
const sessionsInfo = this.db.query('PRAGMA table_info(sdk_sessions)').all() as TableColumnInfo[];
const hasPromptCounter = sessionsInfo.some(col => col.name === 'prompt_counter');
@@ -184,13 +178,12 @@ export class MigrationRunner {
/**
* Remove UNIQUE constraint from session_summaries.memory_session_id (migration 7)
*
* NOTE: Version 7 conflicts with old DatabaseManager migration007 (which adds discovery_tokens).
* We check actual constraint state rather than relying solely on version tracking.
*/
private removeSessionSummariesUniqueConstraint(): void {
// Check if migration already applied
const applied = this.db.prepare('SELECT version FROM schema_versions WHERE version = ?').get(7) as SchemaVersion | undefined;
if (applied) return;
// Check if UNIQUE constraint exists
// Check actual constraint state — don't rely on version tracking alone (issue #979)
const summariesIndexes = this.db.query('PRAGMA index_list(session_summaries)').all() as IndexInfo[];
const hasUniqueConstraint = summariesIndexes.some(idx => idx.unique === 1);
@@ -205,6 +198,9 @@ export class MigrationRunner {
// Begin transaction
this.db.run('BEGIN TRANSACTION');
// Clean up leftover temp table from a previously-crashed run
this.db.run('DROP TABLE IF EXISTS session_summaries_new');
// Create new table without UNIQUE constraint
this.db.run(`
CREATE TABLE session_summaries_new (
@@ -318,6 +314,9 @@ export class MigrationRunner {
// Begin transaction
this.db.run('BEGIN TRANSACTION');
// Clean up leftover temp table from a previously-crashed run
this.db.run('DROP TABLE IF EXISTS observations_new');
// Create new table with text as nullable
this.db.run(`
CREATE TABLE observations_new (
@@ -411,34 +410,39 @@ export class MigrationRunner {
CREATE INDEX idx_user_prompts_lookup ON user_prompts(content_session_id, prompt_number);
`);
// Create FTS5 virtual table
this.db.run(`
CREATE VIRTUAL TABLE user_prompts_fts USING fts5(
prompt_text,
content='user_prompts',
content_rowid='id'
);
`);
// Create FTS5 virtual table — skip if FTS5 is unavailable (e.g., Bun on Windows #791).
// The user_prompts table itself is still created; only FTS indexing is skipped.
try {
this.db.run(`
CREATE VIRTUAL TABLE user_prompts_fts USING fts5(
prompt_text,
content='user_prompts',
content_rowid='id'
);
`);
// Create triggers to sync FTS5
this.db.run(`
CREATE TRIGGER user_prompts_ai AFTER INSERT ON user_prompts BEGIN
INSERT INTO user_prompts_fts(rowid, prompt_text)
VALUES (new.id, new.prompt_text);
END;
// Create triggers to sync FTS5
this.db.run(`
CREATE TRIGGER user_prompts_ai AFTER INSERT ON user_prompts BEGIN
INSERT INTO user_prompts_fts(rowid, prompt_text)
VALUES (new.id, new.prompt_text);
END;
CREATE TRIGGER user_prompts_ad AFTER DELETE ON user_prompts BEGIN
INSERT INTO user_prompts_fts(user_prompts_fts, rowid, prompt_text)
VALUES('delete', old.id, old.prompt_text);
END;
CREATE TRIGGER user_prompts_au AFTER UPDATE ON user_prompts BEGIN
INSERT INTO user_prompts_fts(user_prompts_fts, rowid, prompt_text)
VALUES('delete', old.id, old.prompt_text);
INSERT INTO user_prompts_fts(rowid, prompt_text)
VALUES (new.id, new.prompt_text);
END;
`);
} catch (ftsError) {
logger.warn('DB', 'FTS5 not available — user_prompts_fts skipped (search uses ChromaDB)', {}, ftsError as Error);
}
// Commit transaction
this.db.run('COMMIT');
@@ -446,7 +450,7 @@ export class MigrationRunner {
// Record migration
this.db.prepare('INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)').run(10, new Date().toISOString());
logger.debug('DB', 'Successfully created user_prompts table');
}
/**
@@ -628,4 +632,231 @@ export class MigrationRunner {
this.db.prepare('INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)').run(20, new Date().toISOString());
}
/**
* Add ON UPDATE CASCADE to FK constraints on observations and session_summaries (migration 21)
*
* Both tables have FK(memory_session_id) -> sdk_sessions(memory_session_id) with ON DELETE CASCADE
* but missing ON UPDATE CASCADE. This causes FK constraint violations when code updates
* sdk_sessions.memory_session_id while child rows still reference the old value.
*
* SQLite doesn't support ALTER TABLE for FK changes, so we recreate both tables.
*/
private addOnUpdateCascadeToForeignKeys(): void {
const applied = this.db.prepare('SELECT version FROM schema_versions WHERE version = ?').get(21) as SchemaVersion | undefined;
if (applied) return;
logger.debug('DB', 'Adding ON UPDATE CASCADE to FK constraints on observations and session_summaries');
// PRAGMA foreign_keys must be set outside a transaction
this.db.run('PRAGMA foreign_keys = OFF');
this.db.run('BEGIN TRANSACTION');
try {
// ==========================================
// 1. Recreate observations table
// ==========================================
// Drop FTS triggers first (they reference the observations table)
this.db.run('DROP TRIGGER IF EXISTS observations_ai');
this.db.run('DROP TRIGGER IF EXISTS observations_ad');
this.db.run('DROP TRIGGER IF EXISTS observations_au');
// Clean up leftover temp table from a previously-crashed run
this.db.run('DROP TABLE IF EXISTS observations_new');
this.db.run(`
CREATE TABLE observations_new (
id INTEGER PRIMARY KEY AUTOINCREMENT,
memory_session_id TEXT NOT NULL,
project TEXT NOT NULL,
text TEXT,
type TEXT NOT NULL,
title TEXT,
subtitle TEXT,
facts TEXT,
narrative TEXT,
concepts TEXT,
files_read TEXT,
files_modified TEXT,
prompt_number INTEGER,
discovery_tokens INTEGER DEFAULT 0,
created_at TEXT NOT NULL,
created_at_epoch INTEGER NOT NULL,
FOREIGN KEY(memory_session_id) REFERENCES sdk_sessions(memory_session_id) ON DELETE CASCADE ON UPDATE CASCADE
)
`);
this.db.run(`
INSERT INTO observations_new
SELECT id, memory_session_id, project, text, type, title, subtitle, facts,
narrative, concepts, files_read, files_modified, prompt_number,
discovery_tokens, created_at, created_at_epoch
FROM observations
`);
this.db.run('DROP TABLE observations');
this.db.run('ALTER TABLE observations_new RENAME TO observations');
// Recreate indexes
this.db.run(`
CREATE INDEX idx_observations_sdk_session ON observations(memory_session_id);
CREATE INDEX idx_observations_project ON observations(project);
CREATE INDEX idx_observations_type ON observations(type);
CREATE INDEX idx_observations_created ON observations(created_at_epoch DESC);
`);
// Recreate FTS triggers only if observations_fts exists
const hasFTS = (this.db.prepare("SELECT name FROM sqlite_master WHERE type='table' AND name='observations_fts'").all() as { name: string }[]).length > 0;
if (hasFTS) {
this.db.run(`
CREATE TRIGGER IF NOT EXISTS observations_ai AFTER INSERT ON observations BEGIN
INSERT INTO observations_fts(rowid, title, subtitle, narrative, text, facts, concepts)
VALUES (new.id, new.title, new.subtitle, new.narrative, new.text, new.facts, new.concepts);
END;
CREATE TRIGGER IF NOT EXISTS observations_ad AFTER DELETE ON observations BEGIN
INSERT INTO observations_fts(observations_fts, rowid, title, subtitle, narrative, text, facts, concepts)
VALUES('delete', old.id, old.title, old.subtitle, old.narrative, old.text, old.facts, old.concepts);
END;
CREATE TRIGGER IF NOT EXISTS observations_au AFTER UPDATE ON observations BEGIN
INSERT INTO observations_fts(observations_fts, rowid, title, subtitle, narrative, text, facts, concepts)
VALUES('delete', old.id, old.title, old.subtitle, old.narrative, old.text, old.facts, old.concepts);
INSERT INTO observations_fts(rowid, title, subtitle, narrative, text, facts, concepts)
VALUES (new.id, new.title, new.subtitle, new.narrative, new.text, new.facts, new.concepts);
END;
`);
}
// ==========================================
// 2. Recreate session_summaries table
// ==========================================
// Clean up leftover temp table from a previously-crashed run
this.db.run('DROP TABLE IF EXISTS session_summaries_new');
this.db.run(`
CREATE TABLE session_summaries_new (
id INTEGER PRIMARY KEY AUTOINCREMENT,
memory_session_id TEXT NOT NULL,
project TEXT NOT NULL,
request TEXT,
investigated TEXT,
learned TEXT,
completed TEXT,
next_steps TEXT,
files_read TEXT,
files_edited TEXT,
notes TEXT,
prompt_number INTEGER,
discovery_tokens INTEGER DEFAULT 0,
created_at TEXT NOT NULL,
created_at_epoch INTEGER NOT NULL,
FOREIGN KEY(memory_session_id) REFERENCES sdk_sessions(memory_session_id) ON DELETE CASCADE ON UPDATE CASCADE
)
`);
this.db.run(`
INSERT INTO session_summaries_new
SELECT id, memory_session_id, project, request, investigated, learned,
completed, next_steps, files_read, files_edited, notes,
prompt_number, discovery_tokens, created_at, created_at_epoch
FROM session_summaries
`);
// Drop session_summaries FTS triggers before dropping the table
this.db.run('DROP TRIGGER IF EXISTS session_summaries_ai');
this.db.run('DROP TRIGGER IF EXISTS session_summaries_ad');
this.db.run('DROP TRIGGER IF EXISTS session_summaries_au');
this.db.run('DROP TABLE session_summaries');
this.db.run('ALTER TABLE session_summaries_new RENAME TO session_summaries');
// Recreate indexes
this.db.run(`
CREATE INDEX idx_session_summaries_sdk_session ON session_summaries(memory_session_id);
CREATE INDEX idx_session_summaries_project ON session_summaries(project);
CREATE INDEX idx_session_summaries_created ON session_summaries(created_at_epoch DESC);
`);
// Recreate session_summaries FTS triggers if FTS table exists
const hasSummariesFTS = (this.db.prepare("SELECT name FROM sqlite_master WHERE type='table' AND name='session_summaries_fts'").all() as { name: string }[]).length > 0;
if (hasSummariesFTS) {
this.db.run(`
CREATE TRIGGER IF NOT EXISTS session_summaries_ai AFTER INSERT ON session_summaries BEGIN
INSERT INTO session_summaries_fts(rowid, request, investigated, learned, completed, next_steps, notes)
VALUES (new.id, new.request, new.investigated, new.learned, new.completed, new.next_steps, new.notes);
END;
CREATE TRIGGER IF NOT EXISTS session_summaries_ad AFTER DELETE ON session_summaries BEGIN
INSERT INTO session_summaries_fts(session_summaries_fts, rowid, request, investigated, learned, completed, next_steps, notes)
VALUES('delete', old.id, old.request, old.investigated, old.learned, old.completed, old.next_steps, old.notes);
END;
CREATE TRIGGER IF NOT EXISTS session_summaries_au AFTER UPDATE ON session_summaries BEGIN
INSERT INTO session_summaries_fts(session_summaries_fts, rowid, request, investigated, learned, completed, next_steps, notes)
VALUES('delete', old.id, old.request, old.investigated, old.learned, old.completed, old.next_steps, old.notes);
INSERT INTO session_summaries_fts(rowid, request, investigated, learned, completed, next_steps, notes)
VALUES (new.id, new.request, new.investigated, new.learned, new.completed, new.next_steps, new.notes);
END;
`);
}
// Record migration
this.db.prepare('INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)').run(21, new Date().toISOString());
this.db.run('COMMIT');
this.db.run('PRAGMA foreign_keys = ON');
logger.debug('DB', 'Successfully added ON UPDATE CASCADE to FK constraints');
} catch (error) {
this.db.run('ROLLBACK');
this.db.run('PRAGMA foreign_keys = ON');
throw error;
}
}
/**
* Add content_hash column to observations for deduplication (migration 22)
* Prevents duplicate observations from being stored when the same content is processed multiple times.
* Backfills existing rows with unique random hashes so they don't block new inserts.
*/
private addObservationContentHashColumn(): void {
const applied = this.db.prepare('SELECT version FROM schema_versions WHERE version = ?').get(22) as SchemaVersion | undefined;
if (applied) return;
const tableInfo = this.db.query('PRAGMA table_info(observations)').all() as TableColumnInfo[];
const hasColumn = tableInfo.some(col => col.name === 'content_hash');
if (!hasColumn) {
this.db.run('ALTER TABLE observations ADD COLUMN content_hash TEXT');
// Backfill existing rows with unique random hashes
this.db.run("UPDATE observations SET content_hash = substr(hex(randomblob(8)), 1, 16) WHERE content_hash IS NULL");
// Index for fast dedup lookups
this.db.run('CREATE INDEX IF NOT EXISTS idx_observations_content_hash ON observations(content_hash, created_at_epoch)');
logger.debug('DB', 'Added content_hash column to observations table with backfill and index');
}
this.db.prepare('INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)').run(22, new Date().toISOString());
}
/**
* Add custom_title column to sdk_sessions for agent attribution (migration 23)
* Allows callers (e.g. Maestro agents) to label sessions with a human-readable name.
*/
private addSessionCustomTitleColumn(): void {
const applied = this.db.prepare('SELECT version FROM schema_versions WHERE version = ?').get(23) as SchemaVersion | undefined;
if (applied) return;
const tableInfo = this.db.query('PRAGMA table_info(sdk_sessions)').all() as TableColumnInfo[];
const hasColumn = tableInfo.some(col => col.name === 'custom_title');
if (!hasColumn) {
this.db.run('ALTER TABLE sdk_sessions ADD COLUMN custom_title TEXT');
logger.debug('DB', 'Added custom_title column to sdk_sessions table');
}
this.db.prepare('INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)').run(23, new Date().toISOString());
}
}
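Migration 22's backfill expression, `substr(hex(randomblob(8)), 1, 16)`, gives every pre-existing row a throwaway 16-character hex hash so old rows never trip the dedup index for new inserts. A standalone Node sketch of the same value shape (names here are illustrative, not from the codebase):

```typescript
import { randomBytes } from 'node:crypto';

// Mirror of the SQL backfill: 8 random bytes rendered as 16 hex characters.
// SQLite's hex() is uppercase and Node's toString('hex') is lowercase, but
// only the length and randomness matter for the dedup lookup.
function randomBackfillHash(): string {
  return randomBytes(8).toString('hex');
}
```

With 64 bits of randomness, collisions between backfilled rows are vanishingly unlikely, which is all the backfill needs.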
+52 -3
@@ -3,13 +3,50 @@
* Extracted from SessionStore.ts for modular organization
*/
import { createHash } from 'crypto';
import { Database } from 'bun:sqlite';
import { logger } from '../../../utils/logger.js';
import { getCurrentProjectName } from '../../../shared/paths.js';
import type { ObservationInput, StoreObservationResult } from './types.js';
/** Deduplication window: observations with the same content hash within this window are skipped */
const DEDUP_WINDOW_MS = 30_000;
/**
* Compute a short content hash for deduplication.
* Uses (memory_session_id, title, narrative) as the semantic identity of an observation.
*/
export function computeObservationContentHash(
memorySessionId: string,
title: string | null,
narrative: string | null
): string {
return createHash('sha256')
.update((memorySessionId || '') + (title || '') + (narrative || ''))
.digest('hex')
.slice(0, 16);
}
/**
* Check if a duplicate observation exists within the dedup window.
* Returns the existing observation's id and timestamp if found, null otherwise.
*/
export function findDuplicateObservation(
db: Database,
contentHash: string,
timestampEpoch: number
): { id: number; created_at_epoch: number } | null {
const windowStart = timestampEpoch - DEDUP_WINDOW_MS;
const stmt = db.prepare(
'SELECT id, created_at_epoch FROM observations WHERE content_hash = ? AND created_at_epoch > ?'
);
return (stmt.get(contentHash, windowStart) as { id: number; created_at_epoch: number } | null);
}
/**
* Store an observation (from SDK parsing)
* Assumes session already exists (created by hook)
* Performs content-hash deduplication: skips INSERT if an identical observation exists within 30s
*/
export function storeObservation(
db: Database,
@@ -24,16 +61,27 @@ export function storeObservation(
const timestampEpoch = overrideTimestampEpoch ?? Date.now();
const timestampIso = new Date(timestampEpoch).toISOString();
// Guard against empty project string (race condition where project isn't set yet)
const resolvedProject = project || getCurrentProjectName();
// Content-hash deduplication
const contentHash = computeObservationContentHash(memorySessionId, observation.title, observation.narrative);
const existing = findDuplicateObservation(db, contentHash, timestampEpoch);
if (existing) {
logger.debug('DEDUP', `Skipped duplicate observation | contentHash=${contentHash} | existingId=${existing.id}`);
return { id: existing.id, createdAtEpoch: existing.created_at_epoch };
}
const stmt = db.prepare(`
INSERT INTO observations
(memory_session_id, project, type, title, subtitle, facts, narrative, concepts,
files_read, files_modified, prompt_number, discovery_tokens, content_hash, created_at, created_at_epoch)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
`);
const result = stmt.run(
memorySessionId,
resolvedProject,
observation.type,
observation.title,
observation.subtitle,
@@ -44,6 +92,7 @@ export function storeObservation(
JSON.stringify(observation.files_modified),
promptNumber || null,
discoveryTokens,
contentHash,
timestampIso,
timestampEpoch
);
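`computeObservationContentHash` treats (memory_session_id, title, narrative) as the semantic identity of an observation. A self-contained copy of the computation, useful for reasoning about its behavior:

```typescript
import { createHash } from 'node:crypto';

// Same computation as computeObservationContentHash: SHA-256 over the
// concatenated identity fields, truncated to 16 hex chars (64 bits).
function contentHash(sessionId: string, title: string | null, narrative: string | null): string {
  return createHash('sha256')
    .update((sessionId || '') + (title || '') + (narrative || ''))
    .digest('hex')
    .slice(0, 16);
}
```

Because the fields are concatenated without a separator, ('s1', 'ab', 'c') and ('s1', 'a', 'bc') hash identically; within a 30-second dedup window that cross-field collision is a deliberate, low-risk trade-off.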
+12 -4
@@ -21,7 +21,8 @@ export function createSDKSession(
db: Database,
contentSessionId: string,
project: string,
userPrompt: string,
customTitle?: string
): number {
const now = new Date();
const nowEpoch = now.getTime();
@@ -39,6 +40,13 @@ export function createSDKSession(
WHERE content_session_id = ? AND (project IS NULL OR project = '')
`).run(project, contentSessionId);
}
// Backfill custom_title if provided and not yet set
if (customTitle) {
db.prepare(`
UPDATE sdk_sessions SET custom_title = ?
WHERE content_session_id = ? AND custom_title IS NULL
`).run(customTitle, contentSessionId);
}
return existing.id;
}
@@ -48,9 +56,9 @@ export function createSDKSession(
// must NEVER equal contentSessionId - that would inject memory messages into the user's transcript!
db.prepare(`
INSERT INTO sdk_sessions
(content_session_id, memory_session_id, project, user_prompt, custom_title, started_at, started_at_epoch, status)
VALUES (?, NULL, ?, ?, ?, ?, ?, 'active')
`).run(contentSessionId, project, userPrompt, customTitle || null, now.toISOString(), nowEpoch);
// Return new ID
const row = db.prepare('SELECT id FROM sdk_sessions WHERE content_session_id = ?')
+2 -2
@@ -17,7 +17,7 @@ import type {
*/
export function getSessionById(db: Database, id: number): SessionBasic | null {
const stmt = db.prepare(`
SELECT id, content_session_id, memory_session_id, project, user_prompt, custom_title
FROM sdk_sessions
WHERE id = ?
LIMIT 1
@@ -38,7 +38,7 @@ export function getSdkSessionsBySessionIds(
const placeholders = memorySessionIds.map(() => '?').join(',');
const stmt = db.prepare(`
SELECT id, content_session_id, memory_session_id, project, user_prompt, custom_title,
started_at, started_at_epoch, completed_at, completed_at_epoch, status
FROM sdk_sessions
WHERE memory_session_id IN (${placeholders})
+2
@@ -13,6 +13,7 @@ export interface SessionBasic {
memory_session_id: string | null;
project: string;
user_prompt: string;
custom_title: string | null;
}
/**
@@ -24,6 +25,7 @@ export interface SessionFull {
memory_session_id: string;
project: string;
user_prompt: string;
custom_title: string | null;
started_at: string;
started_at_epoch: number;
completed_at: string | null;
+23 -6
@@ -10,6 +10,7 @@ import { Database } from 'bun:sqlite';
import { logger } from '../../utils/logger.js';
import type { ObservationInput } from './observations/types.js';
import type { SummaryInput } from './summaries/types.js';
import { computeObservationContentHash, findDuplicateObservation } from './observations/store.js';
/**
* Result from storeObservations / storeObservationsAndMarkComplete transaction
@@ -63,15 +64,22 @@ export function storeObservationsAndMarkComplete(
const storeAndMarkTx = db.transaction(() => {
const observationIds: number[] = [];
// 1. Store all observations (with content-hash deduplication)
const obsStmt = db.prepare(`
INSERT INTO observations
(memory_session_id, project, type, title, subtitle, facts, narrative, concepts,
files_read, files_modified, prompt_number, discovery_tokens, content_hash, created_at, created_at_epoch)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
`);
for (const observation of observations) {
const contentHash = computeObservationContentHash(memorySessionId, observation.title, observation.narrative);
const existing = findDuplicateObservation(db, contentHash, timestampEpoch);
if (existing) {
observationIds.push(existing.id);
continue;
}
const result = obsStmt.run(
memorySessionId,
project,
@@ -85,6 +93,7 @@ export function storeObservationsAndMarkComplete(
JSON.stringify(observation.files_modified),
promptNumber || null,
discoveryTokens,
contentHash,
timestampIso,
timestampEpoch
);
@@ -174,15 +183,22 @@ export function storeObservations(
const storeTx = db.transaction(() => {
const observationIds: number[] = [];
// 1. Store all observations (with content-hash deduplication)
const obsStmt = db.prepare(`
INSERT INTO observations
(memory_session_id, project, type, title, subtitle, facts, narrative, concepts,
files_read, files_modified, prompt_number, discovery_tokens, content_hash, created_at, created_at_epoch)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
`);
for (const observation of observations) {
const contentHash = computeObservationContentHash(memorySessionId, observation.title, observation.narrative);
const existing = findDuplicateObservation(db, contentHash, timestampEpoch);
if (existing) {
observationIds.push(existing.id);
continue;
}
const result = obsStmt.run(
memorySessionId,
project,
@@ -196,6 +212,7 @@ export function storeObservations(
JSON.stringify(observation.files_modified),
promptNumber || null,
discoveryTokens,
contentHash,
timestampIso,
timestampEpoch
);
+44 -10
@@ -102,17 +102,23 @@ export class ChromaMcpManager {
const commandArgs = this.buildCommandArgs();
const spawnEnvironment = this.getSpawnEnv();
// On Windows, .cmd files require shell resolution. Since MCP SDK's
// StdioClientTransport doesn't support `shell: true`, route through
// cmd.exe which resolves .cmd/.bat extensions and PATH automatically.
// This also fixes Git Bash compatibility (#1062) since cmd.exe handles
// Windows-native command resolution regardless of the calling shell.
const isWindows = process.platform === 'win32';
const uvxSpawnCommand = isWindows ? (process.env.ComSpec || 'cmd.exe') : 'uvx';
const uvxSpawnArgs = isWindows ? ['/c', 'uvx', ...commandArgs] : commandArgs;
logger.info('CHROMA_MCP', 'Connecting to chroma-mcp via MCP stdio', {
command: uvxSpawnCommand,
args: uvxSpawnArgs.join(' ')
});
this.transport = new StdioClientTransport({
command: uvxSpawnCommand,
args: uvxSpawnArgs,
env: spawnEnvironment,
stderr: 'pipe'
});
@@ -177,6 +183,7 @@ export class ChromaMcpManager {
private buildCommandArgs(): string[] {
const settings = SettingsDefaultsManager.loadFromFile(USER_SETTINGS_PATH);
const chromaMode = settings.CLAUDE_MEM_CHROMA_MODE || 'local';
const pythonVersion = process.env.CLAUDE_MEM_PYTHON_VERSION || settings.CLAUDE_MEM_PYTHON_VERSION || '3.13';
if (chromaMode === 'remote') {
const chromaHost = settings.CLAUDE_MEM_CHROMA_HOST || '127.0.0.1';
@@ -187,6 +194,7 @@ export class ChromaMcpManager {
const chromaApiKey = settings.CLAUDE_MEM_CHROMA_API_KEY || '';
const args = [
'--python', pythonVersion,
'chroma-mcp',
'--client-type', 'http',
'--host', chromaHost,
@@ -214,9 +222,10 @@ export class ChromaMcpManager {
// Local mode: persistent client with data directory
return [
'--python', pythonVersion,
'chroma-mcp',
'--client-type', 'persistent',
'--data-dir', DEFAULT_CHROMA_DATA_DIR.replace(/\\/g, '/')
];
}
@@ -235,10 +244,35 @@ export class ChromaMcpManager {
arguments: JSON.stringify(toolArguments).slice(0, 200)
});
let result;
try {
result = await this.client!.callTool({
name: toolName,
arguments: toolArguments
});
} catch (transportError) {
// Transport error: chroma-mcp subprocess likely died (e.g., killed by orphan reaper,
// HNSW index corruption). Mark connection dead and retry once after reconnect (#1131).
// Without this retry, callers see a one-shot error even though reconnect would succeed.
this.connected = false;
this.client = null;
this.transport = null;
logger.warn('CHROMA_MCP', `Transport error during "${toolName}", reconnecting and retrying once`, {
error: transportError instanceof Error ? transportError.message : String(transportError)
});
try {
await this.ensureConnected();
result = await this.client!.callTool({
name: toolName,
arguments: toolArguments
});
} catch (retryError) {
this.connected = false;
throw new Error(`chroma-mcp transport error during "${toolName}" (retry failed): ${retryError instanceof Error ? retryError.message : String(retryError)}`);
}
}
// MCP tools signal errors via isError flag on the CallToolResult
if (result.isError) {
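The Windows spawn routing above can be distilled into a pure helper. This is a sketch, not the actual ChromaMcpManager API; `toPosixPath` mirrors the `--data-dir` normalization:

```typescript
// Sketch of the spawn decision: .cmd shims need shell resolution, so on
// Windows route through cmd.exe /c; elsewhere spawn uvx directly.
function buildSpawn(platform: string, comSpec: string | undefined, toolArgs: string[]) {
  const isWindows = platform === 'win32';
  return {
    command: isWindows ? (comSpec || 'cmd.exe') : 'uvx',
    args: isWindows ? ['/c', 'uvx', ...toolArgs] : toolArgs,
  };
}

// Mirror of the --data-dir fix: chroma-mcp's Python side mishandles
// backslash paths, and forward slashes are generally accepted on Windows.
function toPosixPath(p: string): string {
  return p.replace(/\\/g, '/');
}
```

Keeping the decision in one pure function makes both branches trivially testable without spawning anything.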
+22 -6
@@ -267,12 +267,28 @@ export class ChromaSync {
for (let i = 0; i < documents.length; i += this.BATCH_SIZE) {
const batch = documents.slice(i, i + this.BATCH_SIZE);
// Sanitize metadata: filter out null, undefined, and empty string values
// that chroma-mcp may reject (e.g., null subtitle from raw SQLite rows)
const cleanMetadatas = batch.map(d =>
Object.fromEntries(
Object.entries(d.metadata).filter(([_, v]) => v !== null && v !== undefined && v !== '')
)
);
try {
await chromaMcp.callTool('chroma_add_documents', {
collection_name: this.collectionName,
ids: batch.map(d => d.id),
documents: batch.map(d => d.document),
metadatas: cleanMetadatas
});
} catch (error) {
logger.error('CHROMA_SYNC', 'Batch add failed, continuing with remaining batches', {
collection: this.collectionName,
batchStart: i,
batchSize: batch.length
}, error as Error);
}
}
logger.debug('CHROMA_SYNC', 'Documents added', {
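The sanitization above is worth isolating, because the filter's exact predicate matters: it must drop null, undefined, and empty strings without also dropping valid falsy values like `0` or `false`. A standalone sketch:

```typescript
// Drop metadata values chroma-mcp may reject, keep everything else,
// including falsy-but-valid values such as 0 and false.
function sanitizeMetadata(metadata: Record<string, unknown>): Record<string, unknown> {
  return Object.fromEntries(
    Object.entries(metadata).filter(([, v]) => v !== null && v !== undefined && v !== '')
  );
}
```

A naive `filter(([, v]) => Boolean(v))` would silently strip numeric zeros and false flags, which is why the three explicit comparisons are used.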
+46 -4
@@ -59,6 +59,10 @@ function clearWorkerSpawnAttempted(): void {
}
}
// Re-export for backward compatibility — canonical implementation in shared/plugin-state.ts
export { isPluginDisabledInClaudeSettings } from '../shared/plugin-state.js';
import { isPluginDisabledInClaudeSettings } from '../shared/plugin-state.js';
// Version injected at build time by esbuild define
declare const __DEFAULT_PACKAGE_VERSION__: string;
const packageVersion = typeof __DEFAULT_PACKAGE_VERSION__ !== 'undefined' ? __DEFAULT_PACKAGE_VERSION__ : '0.0.0-dev';
@@ -74,7 +78,9 @@ import {
cleanStalePidFile,
isProcessAlive,
spawnDaemon,
createSignalHandler,
isPidFileRecent,
touchPidFile
} from './infrastructure/ProcessManager.js';
import {
isPortInUse,
@@ -385,9 +391,14 @@ export class WorkerService {
runOneTimeChromaMigration();
}
// Initialize ChromaMcpManager only if Chroma is enabled
const chromaEnabled = settings.CLAUDE_MEM_CHROMA_ENABLED !== 'false';
if (chromaEnabled) {
this.chromaMcpManager = ChromaMcpManager.getInstance();
logger.info('SYSTEM', 'ChromaMcpManager initialized (lazy - connects on first use)');
} else {
logger.info('SYSTEM', 'Chroma disabled via CLAUDE_MEM_CHROMA_ENABLED=false, skipping ChromaMcpManager');
}
const modeId = settings.CLAUDE_MEM_MODE;
ModeManager.getInstance().loadMode(modeId);
@@ -535,6 +546,9 @@ export class WorkerService {
logger.info('SYSTEM', `Starting generator (${source}) using ${providerName}`, { sessionId: sid });
// Track generator activity for stale detection (Issue #1099)
session.lastGeneratorActivity = Date.now();
session.generatorPromise = agent.startSession(session, this)
.catch(async (error: unknown) => {
const errorMessage = (error as Error)?.message || '';
@@ -893,6 +907,23 @@ async function ensureWorkerStarted(port: number): Promise<boolean> {
if (await waitForHealth(port, 1000)) {
const versionCheck = await checkVersionMatch(port);
if (!versionCheck.matches) {
// Guard: If PID file was written recently, another session is likely already
// restarting the worker. Poll health instead of starting a concurrent restart.
// This prevents the "100 sessions all restart simultaneously" storm (#1145).
const RESTART_COORDINATION_THRESHOLD_MS = 15000;
if (isPidFileRecent(RESTART_COORDINATION_THRESHOLD_MS)) {
logger.info('SYSTEM', 'Version mismatch detected but PID file is recent — another restart likely in progress, polling health', {
pluginVersion: versionCheck.pluginVersion,
workerVersion: versionCheck.workerVersion
});
const healthy = await waitForHealth(port, RESTART_COORDINATION_THRESHOLD_MS);
if (healthy) {
logger.info('SYSTEM', 'Worker became healthy after waiting for concurrent restart');
return true;
}
logger.warn('SYSTEM', 'Worker did not become healthy after waiting — proceeding with own restart');
}
logger.info('SYSTEM', 'Worker version mismatch detected - auto-restarting', {
pluginVersion: versionCheck.pluginVersion,
workerVersion: versionCheck.workerVersion
@@ -957,6 +988,9 @@ async function ensureWorkerStarted(port: number): Promise<boolean> {
}
clearWorkerSpawnAttempted();
// Touch PID file to signal other sessions that a restart just completed.
// Other sessions checking isPidFileRecent() will see this and skip their own restart.
touchPidFile();
logger.info('SYSTEM', 'Worker started successfully');
return true;
}
@@ -967,6 +1001,14 @@ async function ensureWorkerStarted(port: number): Promise<boolean> {
async function main() {
const command = process.argv[2];
// Early exit if plugin is disabled in Claude Code settings (#781).
// Only gate hook-initiated commands; CLI management (stop/status) still works.
const hookInitiatedCommands = ['start', 'hook', 'restart', '--daemon'];
if ((hookInitiatedCommands.includes(command) || command === undefined) && isPluginDisabledInClaudeSettings()) {
process.exit(0);
}
const port = getWorkerPort();
// Helper for JSON status output in 'start' command
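`isPidFileRecent` and `touchPidFile` come from ProcessManager, which is not part of this diff. A plausible sketch of the contract they are used under here (file mtime as a cross-session restart signal); the helper names and signatures are assumptions:

```typescript
import { statSync, utimesSync, writeFileSync, existsSync } from 'node:fs';

// A PID file touched within thresholdMs signals that another session just
// (re)started the worker, so this session should poll health instead of
// launching a competing restart.
function isPidFileRecent(pidFilePath: string, thresholdMs: number): boolean {
  try {
    return Date.now() - statSync(pidFilePath).mtimeMs < thresholdMs;
  } catch {
    return false; // no PID file: nothing recent to defer to
  }
}

// Bump the file's mtime (creating it if needed) to broadcast "restart done".
function touchPidFile(pidFilePath: string): void {
  if (!existsSync(pidFilePath)) writeFileSync(pidFilePath, String(process.pid));
  const now = new Date();
  utimesSync(pidFilePath, now, now);
}
```

Using mtime rather than file contents means the signal needs no parsing and survives concurrent writers: whichever session touched last wins, which is exactly the coordination the restart-storm fix needs.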
+1
@@ -36,6 +36,7 @@ export interface ActiveSession {
consecutiveRestarts: number; // Track consecutive restart attempts to prevent infinite loops
forceInit?: boolean; // Force fresh SDK session (skip resume)
idleTimedOut?: boolean; // Set when session exits due to idle timeout (prevents restart loop)
lastGeneratorActivity: number; // Timestamp of last generator progress (for stale detection, Issue #1099)
// CLAIM-CONFIRM FIX: Track IDs of messages currently being processed
// These IDs will be confirmed (deleted) after successful storage
processingMessageIds: number[];
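`lastGeneratorActivity` only records progress; the detector that consumes it sits outside this diff. The check it enables is simple enough to state exactly (the threshold value is hypothetical):

```typescript
// A generator is considered stale when it has made no progress within
// the threshold. nowMs is injectable so the check is testable.
function isGeneratorStale(lastActivityEpochMs: number, thresholdMs: number, nowMs: number = Date.now()): boolean {
  return nowMs - lastActivityEpochMs > thresholdMs;
}
```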
+12 -7
@@ -11,6 +11,8 @@
import { SessionStore } from '../sqlite/SessionStore.js';
import { SessionSearch } from '../sqlite/SessionSearch.js';
import { ChromaSync } from '../sync/ChromaSync.js';
import { SettingsDefaultsManager } from '../../shared/SettingsDefaultsManager.js';
import { USER_SETTINGS_PATH } from '../../shared/paths.js';
import { logger } from '../../utils/logger.js';
import type { DBSession } from '../worker-types.js';
@@ -27,8 +29,14 @@ export class DatabaseManager {
this.sessionStore = new SessionStore();
this.sessionSearch = new SessionSearch();
// Initialize ChromaSync only if Chroma is enabled (SQLite-only fallback when disabled)
const settings = SettingsDefaultsManager.loadFromFile(USER_SETTINGS_PATH);
const chromaEnabled = settings.CLAUDE_MEM_CHROMA_ENABLED !== 'false';
if (chromaEnabled) {
this.chromaSync = new ChromaSync('claude-mem');
} else {
logger.info('DB', 'Chroma disabled via CLAUDE_MEM_CHROMA_ENABLED=false, using SQLite-only search');
}
logger.info('DB', 'Database initialized');
}
@@ -75,12 +83,9 @@ export class DatabaseManager {
}
/**
* Get ChromaSync instance (returns null if Chroma is disabled)
*/
getChromaSync(): ChromaSync | null {
return this.chromaSync;
}
+1 -1
@@ -39,7 +39,7 @@ export class SearchManager {
constructor(
private sessionSearch: SessionSearch,
private sessionStore: SessionStore,
private chromaSync: ChromaSync | null,
private formatter: FormattingService,
private timelineService: TimelineService
) {
+13 -3
@@ -155,7 +155,8 @@ export class SessionManager {
conversationHistory: [], // Initialize empty - will be populated by agents
currentProvider: null, // Will be set when generator starts
consecutiveRestarts: 0, // Track consecutive restart attempts to prevent infinite loops
processingMessageIds: [], // CLAIM-CONFIRM: Track message IDs for confirmProcessed()
lastGeneratorActivity: Date.now() // Initialize for stale detection (Issue #1099)
};
logger.debug('SESSION', 'Creating new session object (memorySessionId cleared to prevent stale resume)', {
@@ -286,11 +287,17 @@ export class SessionManager {
// 1. Abort the SDK agent
session.abortController.abort();
// 2. Wait for generator to finish (with 30s timeout to prevent stale stall, Issue #1099)
if (session.generatorPromise) {
const generatorDone = session.generatorPromise.catch(() => {
logger.debug('SYSTEM', 'Generator already failed, cleaning up', { sessionId: session.sessionDbId });
});
const timeoutDone = new Promise<'timeout'>(resolve => {
AbortSignal.timeout(30_000).addEventListener('abort', () => resolve('timeout'), { once: true });
});
const winner = await Promise.race([generatorDone.then(() => 'done' as const), timeoutDone]);
if (winner === 'timeout') {
logger.warn('SESSION', 'Generator did not exit within 30s after abort, forcing cleanup (#1099)', { sessionDbId: session.sessionDbId });
}
}
// 3. Verify subprocess exit with 5s timeout (Issue #737 fix)
@@ -468,6 +475,9 @@ export class SessionManager {
session.earliestPendingTimestamp = Math.min(session.earliestPendingTimestamp, message._originalTimestamp);
}
// Update generator activity for stale detection (Issue #1099)
session.lastGeneratorActivity = Date.now();
yield message;
}
}
@@ -56,6 +56,9 @@ export async function processAgentResponse(
agentName: string,
projectRoot?: string
): Promise<void> {
// Track generator activity for stale detection (Issue #1099)
session.lastGeneratorActivity = Date.now();
// Add assistant response to shared conversation history for provider interop
if (text) {
session.conversationHistory.push({ role: 'assistant', content: text });
@@ -189,8 +192,8 @@ async function syncAndBroadcastObservations(
const obs = observations[i];
const chromaStart = Date.now();
-// Sync to Chroma (fire-and-forget)
-dbManager.getChromaSync().syncObservation(
+// Sync to Chroma (fire-and-forget, skipped if Chroma is disabled)
+dbManager.getChromaSync()?.syncObservation(
obsId,
session.contentSessionId,
session.project,
@@ -282,8 +285,8 @@ async function syncAndBroadcastSummary(
const chromaStart = Date.now();
-// Sync to Chroma (fire-and-forget)
-dbManager.getChromaSync().syncSummary(
+// Sync to Chroma (fire-and-forget, skipped if Chroma is disabled)
+dbManager.getChromaSync()?.syncSummary(
result.summaryId,
session.contentSessionId,
session.project,
+2
@@ -37,6 +37,8 @@ export function createMiddleware(
callback(new Error('CORS not allowed'));
}
},
methods: ['GET', 'HEAD', 'POST', 'PUT', 'PATCH', 'DELETE'],
allowedHeaders: ['Content-Type', 'Authorization', 'X-Requested-With'],
credentials: false
}));
+78 -9
@@ -5,12 +5,85 @@
*/
import express, { Request, Response } from 'express';
-import { readFileSync, existsSync, writeFileSync, readdirSync } from 'fs';
+import { openSync, fstatSync, readSync, closeSync, existsSync, writeFileSync } from 'fs';
import { join } from 'path';
import { logger } from '../../../../utils/logger.js';
import { SettingsDefaultsManager } from '../../../../shared/SettingsDefaultsManager.js';
import { BaseRouteHandler } from '../BaseRouteHandler.js';
/**
* Read the last N lines from a file without loading the entire file into memory.
* Reads backwards from the end of the file in chunks until enough lines are found.
*/
export function readLastLines(filePath: string, lineCount: number): { lines: string; totalEstimate: number } {
const fd = openSync(filePath, 'r');
try {
const stat = fstatSync(fd);
const fileSize = stat.size;
if (fileSize === 0) {
return { lines: '', totalEstimate: 0 };
}
// Start with a reasonable chunk size, expand if needed
const INITIAL_CHUNK_SIZE = 64 * 1024; // 64KB
const MAX_READ_SIZE = 10 * 1024 * 1024; // 10MB cap to prevent OOM on huge single-line files
let readSize = Math.min(INITIAL_CHUNK_SIZE, fileSize);
let content = '';
let newlineCount = 0;
while (readSize <= fileSize && readSize <= MAX_READ_SIZE) {
const startPosition = Math.max(0, fileSize - readSize);
const bytesToRead = fileSize - startPosition;
const buffer = Buffer.alloc(bytesToRead);
readSync(fd, buffer, 0, bytesToRead, startPosition);
content = buffer.toString('utf-8');
// Count newlines to see if we have enough
newlineCount = 0;
for (let i = 0; i < content.length; i++) {
if (content[i] === '\n') newlineCount++;
}
// Need at least lineCount newlines so the last lineCount lines are complete (the first line in the chunk may be partial)
if (newlineCount >= lineCount || startPosition === 0) {
break;
}
// Double the read size for next attempt; stop once it can no longer grow
// (already capped at fileSize or MAX_READ_SIZE), otherwise this loop never exits
const nextReadSize = Math.min(readSize * 2, fileSize, MAX_READ_SIZE);
if (nextReadSize === readSize) break;
readSize = nextReadSize;
}
// Split and take the last N lines
const allLines = content.split('\n');
// Remove trailing empty element from final newline
if (allLines.length > 0 && allLines[allLines.length - 1] === '') {
allLines.pop();
}
const startIndex = Math.max(0, allLines.length - lineCount);
const resultLines = allLines.slice(startIndex);
// Estimate total lines: if we read the whole file, we know exactly; otherwise estimate
let totalEstimate: number;
if (fileSize <= readSize) {
totalEstimate = allLines.length;
} else {
// Rough estimate based on average line length in the chunk we read
const avgLineLength = content.length / Math.max(newlineCount, 1);
totalEstimate = Math.round(fileSize / avgLineLength);
}
return {
lines: resultLines.join('\n'),
totalEstimate,
};
} finally {
closeSync(fd);
}
}
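The backward-chunk strategy can be sketched standalone. This is a simplified illustration, not the PR's code: `tailLines` and `demoFile` are hypothetical names, and the sketch keeps the explicit stop when the read window can no longer grow:

```typescript
import { openSync, fstatSync, readSync, closeSync, writeFileSync } from 'fs';
import { tmpdir } from 'os';
import { join } from 'path';

// Read the last `lineCount` lines by growing a window from the end of the file.
function tailLines(filePath: string, lineCount: number): string[] {
  const fd = openSync(filePath, 'r');
  try {
    const fileSize = fstatSync(fd).size;
    if (fileSize === 0) return [];
    let readSize = Math.min(64 * 1024, fileSize);
    let content = '';
    for (;;) {
      const start = fileSize - readSize;
      const buf = Buffer.alloc(readSize);
      readSync(fd, buf, 0, readSize, start);
      content = buf.toString('utf-8');
      const newlines = content.split('\n').length - 1;
      if (newlines >= lineCount || start === 0) break;
      const next = Math.min(readSize * 2, fileSize);
      if (next === readSize) break; // window cannot grow further: use what we have
      readSize = next;
    }
    const lines = content.split('\n');
    if (lines[lines.length - 1] === '') lines.pop(); // drop trailing-newline artifact
    return lines.slice(-lineCount);
  } finally {
    closeSync(fd);
  }
}

// Demo against a throwaway 500-line file.
const demoFile = join(tmpdir(), 'tail-demo.log');
writeFileSync(demoFile, Array.from({ length: 500 }, (_, i) => `line ${i}`).join('\n') + '\n');
console.log(tailLines(demoFile, 3)); // last three lines of the file
```

The doubling loop keeps the common case (small tail of a large log) to one or two 64KB reads instead of loading the whole file.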
export class LogsRoutes extends BaseRouteHandler {
private getLogFilePath(): string {
const dataDir = SettingsDefaultsManager.get('CLAUDE_MEM_DATA_DIR');
@@ -50,19 +123,15 @@ export class LogsRoutes extends BaseRouteHandler {
const requestedLines = parseInt(req.query.lines as string || '1000', 10);
const maxLines = Math.min(requestedLines, 10000); // Cap at 10k lines
-const content = readFileSync(logFilePath, 'utf-8');
-const lines = content.split('\n');
-// Return the last N lines
-const startIndex = Math.max(0, lines.length - maxLines);
-const recentLines = lines.slice(startIndex).join('\n');
+const { lines: recentLines, totalEstimate } = readLastLines(logFilePath, maxLines);
+const returnedLines = recentLines === '' ? 0 : recentLines.split('\n').length;
res.json({
logs: recentLines,
path: logFilePath,
exists: true,
-totalLines: lines.length,
-returnedLines: lines.length - startIndex
+totalLines: totalEstimate,
+returnedLines,
});
});
@@ -90,6 +90,8 @@ export class SessionRoutes extends BaseRouteHandler {
* we let the current generator finish naturally (max 5s linger timeout).
* The next generator will use the new provider with shared conversationHistory.
*/
private static readonly STALE_GENERATOR_THRESHOLD_MS = 30_000; // 30 seconds (#1099)
private ensureGeneratorRunning(sessionDbId: number, source: string): void {
const session = this.sessionManager.getSession(sessionDbId);
if (!session) return;
@@ -109,6 +111,26 @@ export class SessionRoutes extends BaseRouteHandler {
return;
}
// Generator is running - check if stale (no activity for 30s) to prevent queue stall (#1099)
const timeSinceActivity = Date.now() - session.lastGeneratorActivity;
if (timeSinceActivity > SessionRoutes.STALE_GENERATOR_THRESHOLD_MS) {
logger.warn('SESSION', 'Stale generator detected, aborting to prevent queue stall (#1099)', {
sessionId: sessionDbId,
timeSinceActivityMs: timeSinceActivity,
thresholdMs: SessionRoutes.STALE_GENERATOR_THRESHOLD_MS,
source
});
// Abort the stale generator and reset state
session.abortController.abort();
session.generatorPromise = null;
session.abortController = new AbortController();
session.lastGeneratorActivity = Date.now();
// Start a fresh generator
this.spawnInProgress.set(sessionDbId, true);
this.startGeneratorWithProvider(session, selectedProvider, 'stale-recovery');
return;
}
// Generator is running - check if provider changed
if (session.currentProvider && session.currentProvider !== selectedProvider) {
logger.info('SESSION', `Provider changed, will switch after current generator finishes`, {
@@ -155,8 +177,9 @@ export class SessionRoutes extends BaseRouteHandler {
historyLength: session.conversationHistory.length
});
// Track which provider is running
// Track which provider is running and mark activity for stale detection (#1099)
session.currentProvider = provider;
session.lastGeneratorActivity = Date.now();
session.generatorPromise = agent.startSession(session, this.workerService)
.catch(error => {
@@ -669,23 +692,30 @@ export class SessionRoutes extends BaseRouteHandler {
* Returns: { sessionDbId, promptNumber, skipped: boolean, reason?: string }
*/
private handleSessionInitByClaudeId = this.wrapHandler((req: Request, res: Response): void => {
-const { contentSessionId, project, prompt } = req.body;
+const { contentSessionId } = req.body;
+// Only contentSessionId is truly required — Cursor and other platforms
+// may omit prompt/project in their payload (#838, #1049)
+const project = req.body.project || 'unknown';
+const prompt = req.body.prompt || '[media prompt]';
+const customTitle = req.body.customTitle || undefined;
const customTitle = req.body.customTitle || undefined;
logger.info('HTTP', 'SessionRoutes: handleSessionInitByClaudeId called', {
contentSessionId,
project,
-prompt_length: prompt?.length
+prompt_length: prompt?.length,
+customTitle
});
// Validate required parameters
-if (!this.validateRequired(req, res, ['contentSessionId', 'project', 'prompt'])) {
+if (!this.validateRequired(req, res, ['contentSessionId'])) {
return;
}
const store = this.dbManager.getSessionStore();
// Step 1: Create/get SDK session (idempotent INSERT OR IGNORE)
-const sessionDbId = store.createSDKSession(contentSessionId, project, prompt);
+const sessionDbId = store.createSDKSession(contentSessionId, project, prompt, customTitle);
// Verify session creation with DB lookup
const dbSession = store.getSessionById(sessionDbId);
@@ -729,16 +759,22 @@ export class SessionRoutes extends BaseRouteHandler {
// Step 5: Save cleaned user prompt
store.saveUserPrompt(contentSessionId, promptNumber, cleanedPrompt);
// Step 6: Check if SDK agent is already running for this session (#1079)
// If contextInjected is true, the hook should skip re-initializing the SDK agent
const contextInjected = this.sessionManager.getSession(sessionDbId) !== undefined;
// Debug-level log since CREATED already logged the key info
logger.debug('SESSION', 'User prompt saved', {
sessionId: sessionDbId,
-promptNumber
+promptNumber,
+contextInjected
});
res.json({
sessionDbId,
promptNumber,
-skipped: false
+skipped: false,
+contextInjected
});
});
}
+2
@@ -58,6 +58,7 @@ export interface SettingsDefaults {
CLAUDE_MEM_EXCLUDED_PROJECTS: string; // Comma-separated glob patterns for excluded project paths
CLAUDE_MEM_FOLDER_MD_EXCLUDE: string; // JSON array of folder paths to exclude from CLAUDE.md generation
// Chroma Vector Database Configuration
CLAUDE_MEM_CHROMA_ENABLED: string; // 'true' | 'false' - set to 'false' for SQLite-only mode
CLAUDE_MEM_CHROMA_MODE: string; // 'local' | 'remote'
CLAUDE_MEM_CHROMA_HOST: string;
CLAUDE_MEM_CHROMA_PORT: string;
@@ -118,6 +119,7 @@ export class SettingsDefaultsManager {
CLAUDE_MEM_EXCLUDED_PROJECTS: '', // Comma-separated glob patterns for excluded project paths
CLAUDE_MEM_FOLDER_MD_EXCLUDE: '[]', // JSON array of folder paths to exclude from CLAUDE.md generation
// Chroma Vector Database Configuration
CLAUDE_MEM_CHROMA_ENABLED: 'true', // Set to 'false' to disable Chroma and use SQLite-only search
CLAUDE_MEM_CHROMA_MODE: 'local', // 'local' uses persistent chroma-mcp via uvx, 'remote' connects to existing server
CLAUDE_MEM_CHROMA_HOST: '127.0.0.1',
CLAUDE_MEM_CHROMA_PORT: '8000',
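The hunk shows only the new default, not how it is consumed. A plausible reading (an assumption, since the DatabaseManager wiring is not in this diff: only the literal string 'false' opts into SQLite-only mode, matching the 'true' default above) is:

```typescript
// Hypothetical parse of the CLAUDE_MEM_CHROMA_ENABLED string setting.
// Only an explicit 'false' disables Chroma; anything else, including an
// unset value, keeps it enabled so the default stays backward compatible.
function isChromaEnabled(setting: string | undefined): boolean {
  return setting !== 'false';
}

console.log(isChromaEnabled('true'));    // true
console.log(isChromaEnabled('false'));   // false
console.log(isChromaEnabled(undefined)); // true (default applies)
```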
+6 -3
@@ -99,7 +99,9 @@ export function ensureAllClaudeDirs(): void {
}
/**
-* Get current project name from git root or cwd
+* Get current project name from git root or cwd.
+* Includes parent directory to avoid collisions when repos share a folder name
+* (e.g., ~/work/monorepo → "work/monorepo" vs ~/personal/monorepo → "personal/monorepo").
*/
export function getCurrentProjectName(): string {
try {
@@ -109,12 +111,13 @@ export function getCurrentProjectName(): string {
stdio: ['pipe', 'pipe', 'ignore'],
windowsHide: true
}).trim();
-return basename(gitRoot);
+return basename(dirname(gitRoot)) + '/' + basename(gitRoot);
} catch (error) {
logger.debug('SYSTEM', 'Git root detection failed, using cwd basename', {
cwd: process.cwd()
}, error as Error);
-return basename(process.cwd());
+const cwd = process.cwd();
+return basename(dirname(cwd)) + '/' + basename(cwd);
}
}
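The collision-avoiding scheme is easy to see in isolation (`projectNameFor` is a hypothetical helper extracted for illustration, not part of the PR):

```typescript
import { basename, dirname } from 'path';

// parent/basename disambiguates repos that share a folder name.
function projectNameFor(root: string): string {
  return basename(dirname(root)) + '/' + basename(root);
}

console.log(projectNameFor('/home/me/work/monorepo'));     // work/monorepo
console.log(projectNameFor('/home/me/personal/monorepo')); // personal/monorepo
```

Two checkouts named `monorepo` now map to distinct project keys instead of silently sharing one memory store.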
+29
@@ -0,0 +1,29 @@
/**
* Plugin state utilities for checking Claude Code's plugin settings.
* Kept minimal, with no heavy dependencies, so hooks can check quickly.
*/
import { existsSync, readFileSync } from 'fs';
import { join } from 'path';
import { homedir } from 'os';
const PLUGIN_SETTINGS_KEY = 'claude-mem@thedotmack';
/**
* Check if claude-mem is disabled in Claude Code's settings (#781).
* Sync read + JSON parse for speed; called before any async work.
* Returns true only if the plugin is explicitly disabled (enabledPlugins[key] === false).
*/
export function isPluginDisabledInClaudeSettings(): boolean {
try {
const claudeConfigDir = process.env.CLAUDE_CONFIG_DIR || join(homedir(), '.claude');
const settingsPath = join(claudeConfigDir, 'settings.json');
if (!existsSync(settingsPath)) return false;
const raw = readFileSync(settingsPath, 'utf-8');
const settings = JSON.parse(raw);
return settings?.enabledPlugins?.[PLUGIN_SETTINGS_KEY] === false;
} catch {
// If settings can't be read/parsed, assume not disabled
return false;
}
}
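The disabled-check semantics can be exercised without touching a real settings file. In this sketch the parse logic is inlined (assumption: same semantics as the guard above; `isDisabled` is an illustrative name):

```typescript
const PLUGIN_KEY = 'claude-mem@thedotmack';

// Mirrors the guard above: only an explicit `false` disables the plugin;
// a missing key, missing file, or unparseable JSON counts as enabled.
function isDisabled(rawSettingsJson: string | null): boolean {
  try {
    if (rawSettingsJson === null) return false; // settings.json absent
    const settings = JSON.parse(rawSettingsJson);
    return settings?.enabledPlugins?.[PLUGIN_KEY] === false;
  } catch {
    return false; // unreadable settings: assume enabled
  }
}

console.log(isDisabled('{"enabledPlugins":{"claude-mem@thedotmack":false}}')); // true
console.log(isDisabled('{"enabledPlugins":{"claude-mem@thedotmack":true}}'));  // false
console.log(isDisabled('not json'));                                           // false
```

Failing open (enabled) on any read error means a corrupted settings file never silently disables memory capture.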
+5 -1
@@ -1,5 +1,5 @@
import { existsSync, readFileSync, writeFileSync, renameSync, mkdirSync } from 'fs';
-import { dirname } from 'path';
+import { dirname, resolve } from 'path';
import { replaceTaggedContent } from './claude-md-utils.js';
import { logger } from './logger.js';
@@ -10,6 +10,10 @@ import { logger } from './logger.js';
export function writeAgentsMd(agentsPath: string, context: string): void {
if (!agentsPath) return;
// Never write inside .git directories — corrupts refs (#1165)
const resolvedPath = resolve(agentsPath);
if (resolvedPath.includes('/.git/') || resolvedPath.includes('\\.git\\') || resolvedPath.endsWith('/.git') || resolvedPath.endsWith('\\.git')) return;
const dir = dirname(agentsPath);
if (!existsSync(dir)) {
mkdirSync(dir, { recursive: true });
+5
@@ -114,6 +114,11 @@ export function replaceTaggedContent(existingContent: string, newContent: string
* @param newContent - Content to write inside tags
*/
export function writeClaudeMdToFolder(folderPath: string, newContent: string): void {
const resolvedPath = path.resolve(folderPath);
// Never write inside .git directories — corrupts refs (#1165)
if (resolvedPath.includes('/.git/') || resolvedPath.includes('\\.git\\') || resolvedPath.endsWith('/.git') || resolvedPath.endsWith('\\.git')) return;
const claudeMdPath = path.join(folderPath, 'CLAUDE.md');
const tempFile = `${claudeMdPath}.tmp`;
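The same .git guard now appears in both writeAgentsMd and writeClaudeMdToFolder. A shared predicate (a hypothetical refactor, not part of this PR) would keep the two copies from drifting; splitting on both separator styles covers the four substring checks at once:

```typescript
import { resolve } from 'path';

// True when any path segment is exactly '.git', on either separator style.
// Matches 'repo/.git' and 'repo/.git/hooks' but not 'repo/.github'.
function isInsideGitDir(p: string): boolean {
  return resolve(p).split(/[\\/]/).includes('.git');
}

console.log(isInsideGitDir('repo/.git/hooks')); // true
console.log(isInsideGitDir('repo/.github'));    // false
```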