feat: Mode system with inheritance and multilingual support (#412)
* feat: add domain management system with support for multiple domain profiles
- Introduced DomainManager class for loading and managing domain profiles.
- Added support for a default domain ('code') and fallback mechanisms.
- Implemented domain configuration validation and error handling.
- Created types for domain configuration, observation types, and concepts.
- Added new directory for domain profiles and ensured its existence.
- Updated SettingsDefaultsManager to include CLAUDE_MEM_DOMAIN setting.
* Refactor domain management to mode management
- Removed DomainManager class and replaced it with ModeManager for better clarity and functionality.
- Updated types from DomainConfig to ModeConfig and DomainPrompts to ModePrompts.
- Changed references from domains to modes in the settings and paths.
- Ensured backward compatibility by maintaining the fallback mechanism to the 'code' mode.
* feat: add migration 008 to support mode-agnostic observations and refactor service layer references in documentation
* feat: add new modes for code development and email investigation with detailed observation types and concepts
* Refactor observation parsing and prompt generation to incorporate mode-specific configurations
- Updated `parseObservations` function to use 'observation' as a universal fallback type instead of 'change', utilizing active mode's valid observation types.
- Modified `buildInitPrompt` and `buildContinuationPrompt` functions to accept a `ModeConfig` parameter, allowing for dynamic prompt content based on the active mode.
- Enhanced `ModePrompts` interface to include additional guidance for observers, such as recording focus and skip guidance.
- Adjusted the SDKAgent to load the active mode and pass it to prompt generation functions, ensuring prompts are tailored to the current mode's context.
* fix: correct mode prompt injection to preserve exact wording and type list visibility
- Add script to extract prompts from main branch prompts.ts into code.yaml
- Fix prompts.ts to show type list in XML template (e.g., "[ bugfix | feature | ... ]")
- Keep 'change' as fallback type in parser.ts (maintain backwards compatibility)
- Regenerate code.yaml with exact wording from original hardcoded prompts
- Build succeeds with no TypeScript errors
* fix: update ModeManager to load JSON mode files and improve validation
- Changed ModeManager to load mode configurations from JSON files instead of YAML.
- Removed the requirement for an "observation" type and updated validation to require at least one observation type.
- Updated fallback behavior in the parser to use the first type from the active mode's type list.
- Added comprehensive tests for mode loading, prompt injection, and parser integration, ensuring correct behavior across different modes.
- Introduced new mode JSON files for "Code Development" and "Email Investigation" with detailed observation types and prompts.
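The parser fallback described above can be sketched roughly as follows. The `ModeConfig` shape and `resolveObservationType` name are illustrative assumptions, not the actual parser API:

```typescript
// Hypothetical sketch: when an observation arrives with a type the active
// mode does not define, fall back to the first type in the mode's own
// type list instead of a hardcoded 'change'.
interface ModeConfig {
  observationTypes: string[];
}

function resolveObservationType(raw: string | undefined, mode: ModeConfig): string {
  if (raw && mode.observationTypes.includes(raw)) {
    return raw;
  }
  // First type of the active mode acts as the universal fallback
  return mode.observationTypes[0];
}
```

This keeps the fallback mode-relative, so a non-code mode never receives a type that only makes sense for code development.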
* Add mode configuration loading and update licensing information for Ragtime
- Implemented loading of mode configuration in WorkerService before database initialization.
- Added PolyForm Noncommercial License 1.0.0 to Ragtime directory.
- Created README.md for Ragtime with licensing details and usage guidelines.
* fix: add datasets directory to .gitignore to prevent accidental commits
* refactor: remove unused plugin package.json file
* chore: add package.json for claude-mem plugin with version 7.4.5
* refactor: remove outdated tests and improve error handling
- Deleted tests for ChromaSync error handling, smart install, strip memory tags, and user prompt tag stripping due to redundancy or outdated logic.
- Removed vitest configuration as it is no longer needed.
- Added a comprehensive implementation plan for fixing the modes system, addressing critical issues and improving functionality.
- Created a detailed test analysis report highlighting the quality and effectiveness of the current test suite, identifying areas for improvement.
- Introduced a new plugin package.json for runtime dependencies related to claude-mem hooks.
* refactor: remove parser regression tests to streamline codebase
* docs: update CLAUDE.md to clarify test management and changelog generation
* refactor: remove migration008 for mode-agnostic observations
* Refactor observation type handling to use ModeManager for icons and emojis
- Removed direct mappings of observation types to icons and work emojis in context-generator, FormattingService, SearchManager, and TimelineService.
- Integrated ModeManager to dynamically retrieve icons and emojis based on the active mode.
- Improved maintainability by centralizing the logic for observation type representation.
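The centralization described above amounts to a single lookup against the active mode's configuration instead of per-service maps. Names here (`ModeDisplayConfig`, `iconForType`) are illustrative, not the actual ModeManager API:

```typescript
// Hypothetical sketch: one type→icon lookup shared by all services,
// driven by the active mode's configuration with a fallback icon.
interface ModeDisplayConfig {
  typeIcons: Record<string, string>;
  fallbackIcon: string;
}

function iconForType(type: string, mode: ModeDisplayConfig): string {
  return mode.typeIcons[type] ?? mode.fallbackIcon;
}
```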
* Refactor observation metadata constants and update context generator
- Removed the explicit declaration of OBSERVATION_TYPES and OBSERVATION_CONCEPTS from observation-metadata.ts.
- Introduced fallback default strings for DEFAULT_OBSERVATION_TYPES_STRING and DEFAULT_OBSERVATION_CONCEPTS_STRING.
- Updated context-generator.ts to utilize observation types and concepts from ModeManager instead of constants.
* refactor: remove intermediate error handling from hooks (Phase 1)
Apply "fail fast" error handling strategy - errors propagate and crash loud
instead of being caught, wrapped, and re-thrown at intermediate layers.
Changes:
- Remove try/catch around fetch calls in all hooks - let errors throw
- Add try/catch ONLY around JSON.parse at entry points
- Delete error-handler.ts and hook-error-handler.ts (no longer needed)
- Update worker-utils.ts: functions now throw instead of returning null
- Update transcript-parser.ts: throw on missing path, empty file, malformed JSON
- Remove all handleWorkerError, handleFetchError imports
Philosophy: If something breaks, we KNOW it broke. No silent failures.
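The entry-point rule above (try/catch only around JSON.parse, everything else propagates) amounts to a pattern like this sketch; `parseHookInput` is a hypothetical name, not the hook code itself:

```typescript
// Minimal sketch of the fail-fast entry-point pattern: JSON.parse is the
// only guarded call, and the guard re-throws with context instead of
// swallowing the error or returning a silent default.
function parseHookInput<T>(raw: string): T | undefined {
  if (!raw.trim()) return undefined;
  try {
    return JSON.parse(raw) as T;
  } catch (error) {
    // Re-throw loudly; never return null
    throw new Error(
      `Failed to parse hook input: ${error instanceof Error ? error.message : String(error)}`
    );
  }
}
```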
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
* refactor: remove intermediate error handling from worker service (Phase 2)
Apply "fail fast" error handling strategy to worker service layer.
Changes:
- worker-service.ts: Remove try/catch from version endpoint, cleanup,
MCP close, process enumeration, force kill, and isAlive check
- SessionRoutes.ts: Remove try/catch from JSON.stringify calls, remove
.catch() from Chroma sync and SDK agent calls
- SettingsRoutes.ts: Remove try/catch from toggleMcp()
- DatabaseManager.ts: Remove .catch() from backfill and close operations
- SDKAgent.ts: Keep outer try/catch (top-level), remove .catch() from
Chroma sync operations
- SSEBroadcaster.ts: Remove try/catch from broadcast and sendToClient
Philosophy: Errors propagate and crash loud. BaseRouteHandler.wrapHandler
provides top-level catching for HTTP routes.
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
* refactor: remove error swallowing from SQLite services (Phase 3)
Apply "fail fast" error handling strategy to database layer.
SessionStore.ts migrations:
- ensureWorkerPortColumn(): Remove outer try/catch, let it throw
- ensurePromptTrackingColumns(): Remove outer try/catch, let it throw
- removeSessionSummariesUniqueConstraint(): Keep inner transaction
rollback, remove outer catch
- addObservationHierarchicalFields(): Remove outer try/catch
- makeObservationsTextNullable(): Keep inner transaction rollback,
remove outer catch
- createUserPromptsTable(): Keep inner transaction rollback, remove
outer catch
- getFilesForSession(): Remove try/catch around JSON.parse
SessionSearch.ts:
- ensureFTSTables(): Remove try/catch, let it throw
Philosophy: Migration errors that are swallowed mean we think the
database is fine when it's not. Keep only inner transaction rollback
try/catch blocks.
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
* refactor: remove error hiding from utilities (Phase 4)
Apply "fail fast" error handling strategy to utility layer.
logger.ts:
- formatTool(): Remove try/catch, let JSON.parse throw on malformed input
context-generator.ts:
- loadContextConfig(): Remove try/catch, let parseInt throw on invalid settings
- Transcript extraction: Remove try/catch, let file read errors propagate
ChromaSync.ts:
- close(): Remove nested try/catch blocks, let close errors propagate
Philosophy: No silent fallbacks or hidden defaults. If something breaks,
we know it broke immediately.
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
* feat: serve static UI assets and update package root path
- Added middleware to serve static UI assets (JS, CSS, fonts, etc.) in ViewerRoutes.
- Updated getPackageRoot function to correctly return the package root directory as one level up from the current directory.
* feat: Enhance mode loading with inheritance support
- Introduced parseInheritance method to handle parent--override mode IDs.
- Added deepMerge method for recursively merging mode configurations.
- Updated loadMode method to support inheritance, loading parent modes and applying overrides.
- Improved error handling for missing mode files and logging for better traceability.
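The inheritance mechanics described above can be sketched as follows. The `--` ID format matches the commit description, but the function names and merge details are assumptions, not the exact ModeManager implementation:

```typescript
// Hypothetical sketch: a mode ID like "code--es" resolves to parent "code"
// plus override "es"; the override config is deep-merged over the parent's.
type JsonObject = { [key: string]: unknown };

function parseInheritance(modeId: string): { parent: string; override: string } | null {
  const separator = modeId.indexOf('--');
  if (separator === -1) return null;
  return {
    parent: modeId.slice(0, separator),
    override: modeId.slice(separator + 2),
  };
}

function isPlainObject(value: unknown): value is JsonObject {
  return typeof value === 'object' && value !== null && !Array.isArray(value);
}

function deepMerge(base: JsonObject, override: JsonObject): JsonObject {
  const result: JsonObject = { ...base };
  for (const [key, value] of Object.entries(override)) {
    const existing = result[key];
    if (isPlainObject(value) && isPlainObject(existing)) {
      // Recurse into nested objects so partial overrides keep parent keys
      result[key] = deepMerge(existing, value);
    } else {
      result[key] = value;
    }
  }
  return result;
}
```

A child mode therefore only needs to redefine the prompts it translates; everything else falls through to the parent.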
* fix(modes): correct inheritance file resolution and path handling
* Refactor code structure for improved readability and maintainability
* feat: Add mode configuration documentation and examples
* fix: Improve concurrency handling in translateReadme function
* Refactor SDK prompts to enhance clarity and structure
- Updated the `buildInitPrompt` and `buildContinuationPrompt` functions in `prompts.ts` to improve the organization of prompt components, including the addition of language instructions and footer messages.
- Removed redundant instructions and emphasized the importance of recording observations.
- Modified the `ModePrompts` interface in `types.ts` to include new properties for system identity, language instructions, and output format header, ensuring better flexibility and clarity in prompt generation.
* Enhance prompts with language instructions and XML formatting
- Updated `buildInitPrompt`, `buildSummaryPrompt`, and `buildContinuationPrompt` functions to include detailed language instructions in XML comments.
- Ensured that language instructions guide users to keep XML tags in English while writing content in the specified language.
- Modified the `buildSummaryPrompt` function to accept `mode` as a parameter for consistency.
- Adjusted the call to `buildSummaryPrompt` in `SDKAgent` to pass the `mode` argument.
* Refactor XML prompt generation in SDK
- Updated the buildInitPrompt, buildSummaryPrompt, and buildContinuationPrompt functions to use new placeholders for XML elements, improving maintainability and readability.
- Removed redundant language instructions in comments for clarity.
- Added new properties to ModePrompts interface for better structure and organization of XML placeholders and section headers.
* feat: Update observation prompts and structure across multiple languages
* chore: Remove planning docs and update Ragtime README
Remove ephemeral development artifacts:
- .claude/plans/modes-system-fixes.md
- .claude/test-analysis-report.md
- PROMPT_INJECTION_ANALYSIS.md
Update ragtime/README.md to explain:
- Feature is not yet implemented
- Dependency on modes system (now complete in PR #412)
- Ready to be scripted out in future release
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
* fix: Move summary prompts to mode files for multilingual support
Summary prompts were hardcoded in English in prompts.ts, breaking
multilingual support. Now properly mode-based:
- Added summary_instruction, summary_context_label,
summary_format_instruction, summary_footer to code.json
- Updated buildSummaryPrompt() to use mode fields instead of hardcoded text
- Added summary_footer with language instructions to all 10 language modes
- Language modes keep English prompts + language requirement footer
This fixes the mismatch where we claimed full multilingual support
but summaries were still generated in English.
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
* chore: Clean up README by removing local preview instructions and streamlining beta features section
* Add translated README files for Ukrainian, Vietnamese, and Chinese languages
* Add new language modes for code development in multiple languages
- Introduced JSON configurations for Code Development in Greek, Finnish, Hebrew, Hindi, Hungarian, Indonesian, Italian, Dutch, Norwegian, Polish, Brazilian Portuguese, Romanian, Swedish, Turkish, and Ukrainian.
- Each configuration includes prompts for observations, summaries, and instructions tailored to the respective language.
- Ensured that all prompts emphasize the importance of generating observations without referencing the agent's actions.
* Add multilingual support links to README files in various languages
- Updated README.id.md, README.it.md, README.ja.md, README.ko.md, README.nl.md, README.no.md, README.pl.md, README.pt-br.md, README.ro.md, README.ru.md, README.sv.md, README.th.md, README.tr.md, README.uk.md, README.vi.md, and README.zh.md to include links to other language versions.
- Each README now features a centered paragraph with flags and links for easy navigation between different language documents.
---------
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
@@ -1,68 +1,19 @@
 /**
  * Observation metadata constants
  * Shared across hooks, worker service, and UI components
+ *
+ * Note: These are fallback defaults for the code mode.
+ * Actual observation types and concepts are defined per-mode in the modes/ directory.
  */
 
-/**
- * Valid observation types
- */
-export const OBSERVATION_TYPES = [
-  'bugfix',
-  'feature',
-  'refactor',
-  'discovery',
-  'decision',
-  'change'
-] as const;
-
-export type ObservationType = typeof OBSERVATION_TYPES[number];
-
-/**
- * Valid observation concepts
- */
-export const OBSERVATION_CONCEPTS = [
-  'how-it-works',
-  'why-it-exists',
-  'what-changed',
-  'problem-solution',
-  'gotcha',
-  'pattern',
-  'trade-off'
-] as const;
-
-export type ObservationConcept = typeof OBSERVATION_CONCEPTS[number];
-
-/**
- * Map observation types to emoji icons
- */
-export const TYPE_ICON_MAP: Record<ObservationType | 'session-request', string> = {
-  'bugfix': '🔴',
-  'feature': '🟣',
-  'refactor': '🔄',
-  'change': '✅',
-  'discovery': '🔵',
-  'decision': '⚖️',
-  'session-request': '🎯'
-};
-
-/**
- * Map observation types to work emoji (for token display)
- */
-export const TYPE_WORK_EMOJI_MAP: Record<ObservationType, string> = {
-  'discovery': '🔍', // research/exploration
-  'change': '🛠️',    // building/modifying
-  'feature': '🛠️',   // building/modifying
-  'bugfix': '🛠️',    // building/modifying
-  'refactor': '🛠️',  // building/modifying
-  'decision': '⚖️'   // decision-making
-};
-
 /**
  * Default observation types (comma-separated string for settings)
  * Uses code mode defaults as fallback
  */
-export const DEFAULT_OBSERVATION_TYPES_STRING = OBSERVATION_TYPES.join(',');
+export const DEFAULT_OBSERVATION_TYPES_STRING = 'bugfix,feature,refactor,discovery,decision,change';
 
 /**
  * Default observation concepts (comma-separated string for settings)
  * Uses code mode defaults as fallback
  */
-export const DEFAULT_OBSERVATION_CONCEPTS_STRING = OBSERVATION_CONCEPTS.join(',');
+export const DEFAULT_OBSERVATION_CONCEPTS_STRING = 'how-it-works,why-it-exists,what-changed,problem-solution,gotcha,pattern,trade-off';
+18 -20
@@ -30,26 +30,19 @@ async function cleanupHook(input?: SessionEndInput): Promise<void> {
 
   const port = getWorkerPort();
 
-  try {
-    // Send to worker - worker handles finding session, marking complete, and stopping spinner
-    const response = await fetch(`http://127.0.0.1:${port}/api/sessions/complete`, {
-      method: 'POST',
-      headers: { 'Content-Type': 'application/json' },
-      body: JSON.stringify({
-        claudeSessionId: session_id,
-        reason
-      }),
-      signal: AbortSignal.timeout(HOOK_TIMEOUTS.DEFAULT)
-    });
+  // Send to worker - worker handles finding session, marking complete, and stopping spinner
+  const response = await fetch(`http://127.0.0.1:${port}/api/sessions/complete`, {
+    method: 'POST',
+    headers: { 'Content-Type': 'application/json' },
+    body: JSON.stringify({
+      claudeSessionId: session_id,
+      reason
+    }),
+    signal: AbortSignal.timeout(HOOK_TIMEOUTS.DEFAULT)
+  });
 
-    if (!response.ok) {
-      // Non-fatal - session might not exist
-      console.error('[cleanup-hook] Session not found or already cleaned up');
-    }
-  } catch (error: any) {
-    // Worker might not be running - that's okay (non-critical)
-    // But we should still log it for visibility
-    console.error('[cleanup-hook] Failed to notify worker of session end:', error.message);
+  if (!response.ok) {
+    throw new Error(`Session cleanup failed: ${response.status}`);
   }
 
   console.log('{"continue": true, "suppressOutput": true}');
@@ -64,7 +57,12 @@ if (stdin.isTTY) {
   let input = '';
   stdin.on('data', (chunk) => input += chunk);
   stdin.on('end', async () => {
-    const parsed = input ? JSON.parse(input) : undefined;
+    let parsed: SessionEndInput | undefined;
+    try {
+      parsed = input ? JSON.parse(input) : undefined;
+    } catch (error) {
+      throw new Error(`Failed to parse hook input: ${error instanceof Error ? error.message : String(error)}`);
+    }
     await cleanupHook(parsed);
   });
 }
+12 -19
@@ -9,8 +9,6 @@
 import { stdin } from "process";
 import { ensureWorkerRunning, getWorkerPort } from "../shared/worker-utils.js";
 import { HOOK_TIMEOUTS } from "../shared/hook-constants.js";
-import { handleWorkerError } from "../shared/hook-error-handler.js";
-import { handleFetchError } from "./shared/error-handler.js";
 import { getProjectName } from "../utils/project-name.js";
 
 export interface SessionStartInput {
@@ -30,24 +28,14 @@ async function contextHook(input?: SessionStartInput): Promise<string> {
 
   const url = `http://127.0.0.1:${port}/api/context/inject?project=${encodeURIComponent(project)}`;
 
-  try {
-    const response = await fetch(url, { signal: AbortSignal.timeout(HOOK_TIMEOUTS.DEFAULT) });
+  const response = await fetch(url, { signal: AbortSignal.timeout(HOOK_TIMEOUTS.DEFAULT) });
 
-    if (!response.ok) {
-      const errorText = await response.text();
-      handleFetchError(response, errorText, {
-        hookName: 'context',
-        operation: 'Context generation',
-        project,
-        port
-      });
-    }
-
-    const result = await response.text();
-    return result.trim();
-  } catch (error: any) {
-    handleWorkerError(error);
+  if (!response.ok) {
+    throw new Error(`Context generation failed: ${response.status}`);
   }
+
+  const result = await response.text();
+  return result.trim();
 }
 
 // Entry Point - handle stdin/stdout
@@ -62,7 +50,12 @@ if (stdin.isTTY || forceColors) {
   let input = "";
   stdin.on("data", (chunk) => (input += chunk));
   stdin.on("end", async () => {
-    const parsed = input.trim() ? JSON.parse(input) : undefined;
+    let parsed: SessionStartInput | undefined;
+    try {
+      parsed = input.trim() ? JSON.parse(input) : undefined;
+    } catch (error) {
+      throw new Error(`Failed to parse hook input: ${error instanceof Error ? error.message : String(error)}`);
+    }
     const text = await contextHook(parsed);
 
     console.log(
+40 -61
@@ -1,8 +1,6 @@
 import { stdin } from 'process';
 import { STANDARD_HOOK_RESPONSE } from './hook-response.js';
 import { ensureWorkerRunning, getWorkerPort } from '../shared/worker-utils.js';
-import { handleWorkerError } from '../shared/hook-error-handler.js';
-import { handleFetchError } from './shared/error-handler.js';
 import { getProjectName } from '../utils/project-name.js';
 
 export interface UserPromptSubmitInput {
@@ -29,72 +27,48 @@ async function newHook(input?: UserPromptSubmitInput): Promise<void> {
   const port = getWorkerPort();
 
   // Initialize session via HTTP - handles DB operations and privacy checks
-  let sessionDbId: number;
-  let promptNumber: number;
-
-  try {
-    const initResponse = await fetch(`http://127.0.0.1:${port}/api/sessions/init`, {
-      method: 'POST',
-      headers: { 'Content-Type': 'application/json' },
-      body: JSON.stringify({
-        claudeSessionId: session_id,
-        project,
-        prompt
-      }),
-      signal: AbortSignal.timeout(5000)
-    });
-
-    if (!initResponse.ok) {
-      const errorText = await initResponse.text();
-      handleFetchError(initResponse, errorText, {
-        hookName: 'new',
-        operation: 'Session initialization',
-        project,
-        port
-      });
-    }
-
-    const initResult = await initResponse.json();
-    sessionDbId = initResult.sessionDbId;
-    promptNumber = initResult.promptNumber;
-
-    // Check if prompt was entirely private (worker performs privacy check)
-    if (initResult.skipped && initResult.reason === 'private') {
-      console.error(`[new-hook] Session ${sessionDbId}, prompt #${promptNumber} (fully private - skipped)`);
-      console.log(STANDARD_HOOK_RESPONSE);
-      return;
-    }
-
-    console.error(`[new-hook] Session ${sessionDbId}, prompt #${promptNumber}`);
-  } catch (error: any) {
-    handleWorkerError(error);
-  }
+  const initResponse = await fetch(`http://127.0.0.1:${port}/api/sessions/init`, {
+    method: 'POST',
+    headers: { 'Content-Type': 'application/json' },
+    body: JSON.stringify({
+      claudeSessionId: session_id,
+      project,
+      prompt
+    }),
+    signal: AbortSignal.timeout(5000)
+  });
+
+  if (!initResponse.ok) {
+    throw new Error(`Session initialization failed: ${initResponse.status}`);
+  }
+
+  const initResult = await initResponse.json();
+  const sessionDbId = initResult.sessionDbId;
+  const promptNumber = initResult.promptNumber;
+
+  // Check if prompt was entirely private (worker performs privacy check)
+  if (initResult.skipped && initResult.reason === 'private') {
+    console.error(`[new-hook] Session ${sessionDbId}, prompt #${promptNumber} (fully private - skipped)`);
+    console.log(STANDARD_HOOK_RESPONSE);
+    return;
+  }
+
+  console.error(`[new-hook] Session ${sessionDbId}, prompt #${promptNumber}`);
 
   // Strip leading slash from commands for memory agent
   // /review 101 → review 101 (more semantic for observations)
   const cleanedPrompt = prompt.startsWith('/') ? prompt.substring(1) : prompt;
 
-  try {
-    // Initialize SDK agent session via HTTP (starts the agent!)
-    const response = await fetch(`http://127.0.0.1:${port}/sessions/${sessionDbId}/init`, {
-      method: 'POST',
-      headers: { 'Content-Type': 'application/json' },
-      body: JSON.stringify({ userPrompt: cleanedPrompt, promptNumber }),
-      signal: AbortSignal.timeout(5000)
-    });
-
-    if (!response.ok) {
-      const errorText = await response.text();
-      handleFetchError(response, errorText, {
-        hookName: 'new',
-        operation: 'SDK agent start',
-        project,
-        port,
-        sessionId: String(sessionDbId)
-      });
-    }
-  } catch (error: any) {
-    handleWorkerError(error);
-  }
+  // Initialize SDK agent session via HTTP (starts the agent!)
+  const response = await fetch(`http://127.0.0.1:${port}/sessions/${sessionDbId}/init`, {
+    method: 'POST',
+    headers: { 'Content-Type': 'application/json' },
+    body: JSON.stringify({ userPrompt: cleanedPrompt, promptNumber }),
+    signal: AbortSignal.timeout(5000)
+  });
+
+  if (!response.ok) {
+    throw new Error(`SDK agent start failed: ${response.status}`);
+  }
 
   console.log(STANDARD_HOOK_RESPONSE);
@@ -104,6 +78,11 @@ async function newHook(input?: UserPromptSubmitInput): Promise<void> {
   let input = '';
   stdin.on('data', (chunk) => input += chunk);
   stdin.on('end', async () => {
-    const parsed = input ? JSON.parse(input) : undefined;
+    let parsed: UserPromptSubmitInput | undefined;
+    try {
+      parsed = input ? JSON.parse(input) : undefined;
+    } catch (error) {
+      throw new Error(`Failed to parse hook input: ${error instanceof Error ? error.message : String(error)}`);
+    }
     await newHook(parsed);
   });
+23 -31
@@ -11,8 +11,6 @@ import { STANDARD_HOOK_RESPONSE } from './hook-response.js';
 import { logger } from '../utils/logger.js';
 import { ensureWorkerRunning, getWorkerPort } from '../shared/worker-utils.js';
 import { HOOK_TIMEOUTS } from '../shared/hook-constants.js';
-import { handleWorkerError } from '../shared/hook-error-handler.js';
-import { handleFetchError } from './shared/error-handler.js';
 
 export interface PostToolUseInput {
   session_id: string;
@@ -48,37 +46,26 @@ async function saveHook(input?: PostToolUseInput): Promise<void> {
     throw new Error(`Missing cwd in PostToolUse hook input for session ${session_id}, tool ${tool_name}`);
   }
 
-  try {
-    // Send to worker - worker handles privacy check and database operations
-    const response = await fetch(`http://127.0.0.1:${port}/api/sessions/observations`, {
-      method: 'POST',
-      headers: { 'Content-Type': 'application/json' },
-      body: JSON.stringify({
-        claudeSessionId: session_id,
-        tool_name,
-        tool_input,
-        tool_response,
-        cwd
-      }),
-      signal: AbortSignal.timeout(HOOK_TIMEOUTS.DEFAULT)
-    });
+  // Send to worker - worker handles privacy check and database operations
+  const response = await fetch(`http://127.0.0.1:${port}/api/sessions/observations`, {
+    method: 'POST',
+    headers: { 'Content-Type': 'application/json' },
+    body: JSON.stringify({
+      claudeSessionId: session_id,
+      tool_name,
+      tool_input,
+      tool_response,
+      cwd
+    }),
+    signal: AbortSignal.timeout(HOOK_TIMEOUTS.DEFAULT)
+  });
 
-    if (!response.ok) {
-      const errorText = await response.text();
-      handleFetchError(response, errorText, {
-        hookName: 'save',
-        operation: 'Observation storage',
-        toolName: tool_name,
-        sessionId: session_id,
-        port
-      });
-    }
-
-    logger.debug('HOOK', 'Observation sent successfully', { toolName: tool_name });
-  } catch (error: any) {
-    handleWorkerError(error);
+  if (!response.ok) {
+    throw new Error(`Observation storage failed: ${response.status}`);
   }
 
+  logger.debug('HOOK', 'Observation sent successfully', { toolName: tool_name });
+
   console.log(STANDARD_HOOK_RESPONSE);
 }
 
@@ -86,6 +73,11 @@ async function saveHook(input?: PostToolUseInput): Promise<void> {
   let input = '';
   stdin.on('data', (chunk) => input += chunk);
   stdin.on('end', async () => {
-    const parsed = input ? JSON.parse(input) : undefined;
+    let parsed: PostToolUseInput | undefined;
+    try {
+      parsed = input ? JSON.parse(input) : undefined;
+    } catch (error) {
+      throw new Error(`Failed to parse hook input: ${error instanceof Error ? error.message : String(error)}`);
+    }
     await saveHook(parsed);
   });
@@ -1,37 +0,0 @@
-import { logger } from '../../utils/logger.js';
-import { getWorkerRestartInstructions } from '../../utils/error-messages.js';
-
-export interface HookErrorContext {
-  hookName: string;
-  operation: string;
-  project?: string;
-  sessionId?: string;
-  toolName?: string;
-  port?: number;
-}
-
-/**
- * Standardized error handler for hook fetch failures.
- *
- * This function:
- * 1. Logs the error with full context to worker logs
- * 2. Throws a user-facing error with restart instructions
- *
- * Use this for all fetch errors in hooks to ensure consistent error handling.
- */
-export function handleFetchError(
-  response: Response,
-  errorText: string,
-  context: HookErrorContext
-): never {
-  logger.error('HOOK', `${context.operation} failed`, {
-    status: response.status,
-    ...context
-  }, errorText);
-
-  const userMessage = context.toolName
-    ? `Failed ${context.operation} for ${context.toolName}: ${getWorkerRestartInstructions()}`
-    : `${context.operation} failed: ${getWorkerRestartInstructions()}`;
-
-  throw new Error(userMessage);
-}
+20
-50
@@ -14,8 +14,6 @@ import { STANDARD_HOOK_RESPONSE } from './hook-response.js';
|
||||
import { logger } from '../utils/logger.js';
|
||||
import { ensureWorkerRunning, getWorkerPort } from '../shared/worker-utils.js';
|
||||
import { HOOK_TIMEOUTS } from '../shared/hook-constants.js';
|
||||
import { handleWorkerError } from '../shared/hook-error-handler.js';
|
||||
import { handleFetchError } from './shared/error-handler.js';
|
||||
import { extractLastMessage } from '../shared/transcript-parser.js';
|
||||
|
||||
export interface StopInput {

@@ -54,56 +52,23 @@ async function summaryHook(input?: StopInput): Promise<void> {
      hasLastAssistantMessage: !!lastAssistantMessage
    });

  let summaryError: Error | null = null;

  try {
    // Send to worker - worker handles privacy check and database operations
    const response = await fetch(`http://127.0.0.1:${port}/api/sessions/summarize`, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        claudeSessionId: session_id,
        last_user_message: lastUserMessage,
        last_assistant_message: lastAssistantMessage
      }),
      signal: AbortSignal.timeout(HOOK_TIMEOUTS.DEFAULT)
    });

    if (!response.ok) {
      const errorText = await response.text();
      handleFetchError(response, errorText, {
        hookName: 'summary',
        operation: 'Summary generation',
        sessionId: session_id,
        port
      });
    }

    logger.debug('HOOK', 'Summary request sent successfully');
  } catch (error: any) {
    summaryError = error;
    handleWorkerError(error);
  } finally {
    // Stop processing spinner (non-critical operation, errors are logged but don't block)
    try {
      const spinnerResponse = await fetch(`http://127.0.0.1:${port}/api/processing`, {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ isProcessing: false }),
        signal: AbortSignal.timeout(2000)
      });
      if (!spinnerResponse.ok) {
        logger.warn('HOOK', 'Failed to stop spinner', { status: spinnerResponse.status });
      }
    } catch (error: any) {
      logger.warn('HOOK', 'Could not stop spinner', { error: error.message });
    }
  }

  // Re-throw summary error after cleanup to ensure it's not masked by finally block
  if (summaryError) {
    throw summaryError;
  }

  console.log(STANDARD_HOOK_RESPONSE);
}
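The hunk above replaces a bare top-level fetch with a capture/cleanup/re-throw sequence: the request error is stashed instead of thrown, the spinner shutdown always runs in `finally`, and the stashed error is re-thrown only after cleanup so the `finally` block cannot mask it. A minimal self-contained sketch of that control flow (the helper name and console logging are illustrative, not part of the plugin):

```typescript
// Run `work`, always run `cleanup`, and surface the original error
// only after cleanup has finished (so cleanup failures never mask it).
async function runWithCleanup<T>(
  work: () => Promise<T>,
  cleanup: () => Promise<void>
): Promise<T | undefined> {
  let workError: Error | null = null;
  let result: T | undefined;
  try {
    result = await work();
  } catch (error: any) {
    workError = error; // stash instead of re-throwing immediately
  } finally {
    try {
      await cleanup(); // non-critical: log-and-continue on failure
    } catch (cleanupError: any) {
      console.warn('cleanup failed:', cleanupError.message);
    }
  }
  // Re-throw after the finally block, so a throw during cleanup
  // cannot replace the original work error.
  if (workError) throw workError;
  return result;
}
```

Throwing from inside a `finally` block would itself mask the pending exception, which is why the re-throw sits after it.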
@@ -112,6 +77,11 @@ async function summaryHook(input?: StopInput): Promise<void> {
let input = '';
stdin.on('data', (chunk) => input += chunk);
stdin.on('end', async () => {
  let parsed: StopInput | undefined;
  try {
    parsed = input ? JSON.parse(input) : undefined;
  } catch (error) {
    throw new Error(`Failed to parse hook input: ${error instanceof Error ? error.message : String(error)}`);
  }
  await summaryHook(parsed);
});
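The stdin handler above now fails loudly with context instead of letting `JSON.parse` surface a bare `SyntaxError`. A small sketch of that wrapper in isolation (the function name is illustrative):

```typescript
// Parse hook input, turning a bare SyntaxError into a descriptive error.
// Empty input is valid and yields `undefined` (hook invoked with no payload).
function parseHookInput<T>(input: string): T | undefined {
  if (!input) return undefined;
  try {
    return JSON.parse(input) as T;
  } catch (error) {
    const detail = error instanceof Error ? error.message : String(error);
    throw new Error(`Failed to parse hook input: ${detail}`);
  }
}
```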
@@ -9,57 +9,32 @@
import { basename } from "path";
import { ensureWorkerRunning, getWorkerPort } from "../shared/worker-utils.js";
import { HOOK_EXIT_CODES } from "../shared/hook-constants.js";
import { getWorkerRestartInstructions } from "../utils/error-messages.js";

try {
  // Ensure worker is running
  await ensureWorkerRunning();

  const port = getWorkerPort();
  const project = basename(process.cwd());

  // Fetch formatted context directly from worker API
  const response = await fetch(
    `http://127.0.0.1:${port}/api/context/inject?project=${encodeURIComponent(project)}&colors=true`,
    { method: 'GET', signal: AbortSignal.timeout(5000) }
  );

  if (!response.ok) {
    throw new Error(getWorkerRestartInstructions({ includeSkillFallback: true }));
  }

  const output = await response.text();

  console.error(
    "\n\n📝 Claude-Mem Context Loaded\n" +
    " ℹ️ Note: This appears as stderr but is informational only\n\n" +
    output +
    "\n\n💡 New! Wrap all or part of any message with <private> ... </private> to prevent storing sensitive information in your observation history.\n" +
    "\n💬 Community https://discord.gg/J4wttp9vDu" +
    `\n📺 Watch live in browser http://localhost:${port}/\n`
  );

} catch (error) {
  // Context not available yet - likely first run or worker starting up
  console.error(`
---
🎉 Note: This appears under Plugin Hook Error, but it's not an error. That's the only option for
user messages in Claude Code UI until a better method is provided.
---

⚠️ Claude-Mem: First-Time Setup

Dependencies are installing in the background. This only happens once.

💡 TIPS:
• Memories will start generating while you work
• Use /init to write or update your CLAUDE.md for better project context
• Try /clear after one session to see what context looks like

Thank you for installing Claude-Mem!

This message was not added to your startup context, so you can continue working as normal.
`);
}

process.exit(HOOK_EXIT_CODES.USER_MESSAGE_ONLY);
@@ -1,403 +0,0 @@
/**
 * Parser Regression Tests
 * Ensures v4.2.5 and v4.2.6 bugfixes remain stable
 */

import { parseObservations, parseSummary } from './parser.js';

// ANSI color codes for output
const GREEN = '\x1b[32m';
const RED = '\x1b[31m';
const YELLOW = '\x1b[33m';
const RESET = '\x1b[0m';

let testsRun = 0;
let testsPassed = 0;
let testsFailed = 0;

function assert(condition: boolean, testName: string, errorMsg?: string): void {
  testsRun++;
  if (condition) {
    testsPassed++;
    console.log(`${GREEN}✓${RESET} ${testName}`);
  } else {
    testsFailed++;
    console.log(`${RED}✗${RESET} ${testName}`);
    if (errorMsg) {
      console.log(`  ${RED}${errorMsg}${RESET}`);
    }
  }
}

function assertEqual<T>(actual: T, expected: T, testName: string): void {
  const isEqual = JSON.stringify(actual) === JSON.stringify(expected);
  if (!isEqual) {
    assert(false, testName, `Expected: ${JSON.stringify(expected)}, Got: ${JSON.stringify(actual)}`);
  } else {
    assert(true, testName);
  }
}

console.log('\n' + YELLOW + '='.repeat(60) + RESET);
console.log(YELLOW + 'Parser Regression Tests (v4.2.5 & v4.2.6)' + RESET);
console.log(YELLOW + '='.repeat(60) + RESET + '\n');
// ============================================================================
// v4.2.6: Observation Parsing - NEVER Skip Observations
// ============================================================================

console.log(YELLOW + '\nv4.2.6: Observation Validation Fixes' + RESET);
console.log('─'.repeat(60) + '\n');

// Test 1: Observation with missing title should be saved
const missingTitleXml = `
<observation>
<type>feature</type>
<subtitle>Added new feature</subtitle>
<narrative>Implemented the feature successfully</narrative>
<facts>
<fact>Created new file</fact>
</facts>
<concepts>
<concept>authentication</concept>
</concepts>
<files_read></files_read>
<files_modified>
<file>src/app.ts</file>
</files_modified>
</observation>
`;

const missingTitleResult = parseObservations(missingTitleXml);
assert(missingTitleResult.length === 1, 'Should parse observation with missing title');
assert(missingTitleResult[0].title === null, 'Missing title should be null');
assertEqual(missingTitleResult[0].type, 'feature', 'Should preserve type when title missing');

// Test 2: Observation with missing subtitle should be saved
const missingSubtitleXml = `
<observation>
<type>bugfix</type>
<title>Fixed critical bug</title>
<narrative>Resolved the issue</narrative>
<facts></facts>
<concepts></concepts>
<files_read></files_read>
<files_modified></files_modified>
</observation>
`;

const missingSubtitleResult = parseObservations(missingSubtitleXml);
assert(missingSubtitleResult.length === 1, 'Should parse observation with missing subtitle');
assert(missingSubtitleResult[0].subtitle === null, 'Missing subtitle should be null');
assertEqual(missingSubtitleResult[0].title, 'Fixed critical bug', 'Should preserve title when subtitle missing');

// Test 3: Observation with missing narrative should be saved
const missingNarrativeXml = `
<observation>
<type>refactor</type>
<title>Code cleanup</title>
<subtitle>Improved structure</subtitle>
<facts>
<fact>Removed dead code</fact>
</facts>
<concepts></concepts>
<files_read></files_read>
<files_modified></files_modified>
</observation>
`;

const missingNarrativeResult = parseObservations(missingNarrativeXml);
assert(missingNarrativeResult.length === 1, 'Should parse observation with missing narrative');
assert(missingNarrativeResult[0].narrative === null, 'Missing narrative should be null');
assertEqual(missingNarrativeResult[0].facts, ['Removed dead code'], 'Should preserve facts when narrative missing');

// Test 4: Observation with ALL fields missing (except type) should be saved
const minimalObservationXml = `
<observation>
<type>change</type>
<title></title>
<subtitle></subtitle>
<narrative></narrative>
<facts></facts>
<concepts></concepts>
<files_read></files_read>
<files_modified></files_modified>
</observation>
`;

const minimalResult = parseObservations(minimalObservationXml);
assert(minimalResult.length === 1, 'Should parse minimal observation with only type');
assertEqual(minimalResult[0].type, 'change', 'Should preserve type for minimal observation');
assert(minimalResult[0].title === null, 'Empty title should be null');
assert(minimalResult[0].subtitle === null, 'Empty subtitle should be null');
assert(minimalResult[0].narrative === null, 'Empty narrative should be null');

// Test 5: Observation with missing type should use "change" as fallback
const missingTypeXml = `
<observation>
<title>Something happened</title>
<subtitle>Details here</subtitle>
<narrative>More info</narrative>
<facts></facts>
<concepts></concepts>
<files_read></files_read>
<files_modified></files_modified>
</observation>
`;

const missingTypeResult = parseObservations(missingTypeXml);
assert(missingTypeResult.length === 1, 'Should parse observation with missing type');
assertEqual(missingTypeResult[0].type, 'change', 'Missing type should default to "change"');

// Test 6: Observation with invalid type should use "change" as fallback
const invalidTypeXml = `
<observation>
<type>invalid_type_here</type>
<title>Something happened</title>
<subtitle>Details here</subtitle>
<narrative>More info</narrative>
<facts></facts>
<concepts></concepts>
<files_read></files_read>
<files_modified></files_modified>
</observation>
`;

const invalidTypeResult = parseObservations(invalidTypeXml);
assert(invalidTypeResult.length === 1, 'Should parse observation with invalid type');
assertEqual(invalidTypeResult[0].type, 'change', 'Invalid type should default to "change"');

// Test 7: Multiple observations with mixed completeness should all be saved
const mixedObservationsXml = `
<observation>
<type>feature</type>
<title>Full observation</title>
<subtitle>Complete</subtitle>
<narrative>All fields present</narrative>
<facts><fact>Fact 1</fact></facts>
<concepts><concept>concept1</concept></concepts>
<files_read></files_read>
<files_modified></files_modified>
</observation>
<observation>
<type>bugfix</type>
<subtitle>Only subtitle and type</subtitle>
<facts></facts>
<concepts></concepts>
<files_read></files_read>
<files_modified></files_modified>
</observation>
<observation>
<title>Only title, no type</title>
<facts></facts>
<concepts></concepts>
<files_read></files_read>
<files_modified></files_modified>
</observation>
`;

const mixedResult = parseObservations(mixedObservationsXml);
assertEqual(mixedResult.length, 3, 'Should parse all three observations regardless of completeness');
assertEqual(mixedResult[0].type, 'feature', 'First observation should have correct type');
assertEqual(mixedResult[1].type, 'bugfix', 'Second observation should have correct type');
assertEqual(mixedResult[2].type, 'change', 'Third observation should default to "change"');
// ============================================================================
// v4.2.5: Summary Parsing - NEVER Skip Summaries
// ============================================================================

console.log(YELLOW + '\nv4.2.5: Summary Validation Fixes' + RESET);
console.log('─'.repeat(60) + '\n');

// Test 8: Summary with missing request field should be saved
const missingRequestXml = `
<summary>
<investigated>Looked into the codebase</investigated>
<learned>Found the issue</learned>
<completed>Fixed the bug</completed>
<next_steps>Deploy to production</next_steps>
</summary>
`;

const missingRequestResult = parseSummary(missingRequestXml);
assert(missingRequestResult !== null, 'Should parse summary with missing request');
assert(missingRequestResult!.request === null, 'Missing request should be null');
assertEqual(missingRequestResult!.investigated, 'Looked into the codebase', 'Should preserve other fields');

// Test 9: Summary with missing investigated field should be saved
const missingInvestigatedXml = `
<summary>
<request>Fix the bug</request>
<learned>Root cause identified</learned>
<completed>Applied the fix</completed>
<next_steps>Monitor production</next_steps>
</summary>
`;

const missingInvestigatedResult = parseSummary(missingInvestigatedXml);
assert(missingInvestigatedResult !== null, 'Should parse summary with missing investigated');
assert(missingInvestigatedResult!.investigated === null, 'Missing investigated should be null');

// Test 10: Summary with missing learned field should be saved
const missingLearnedXml = `
<summary>
<request>Add new feature</request>
<investigated>Reviewed the requirements</investigated>
<completed>Implemented the feature</completed>
<next_steps>Write tests</next_steps>
</summary>
`;

const missingLearnedResult = parseSummary(missingLearnedXml);
assert(missingLearnedResult !== null, 'Should parse summary with missing learned');
assert(missingLearnedResult!.learned === null, 'Missing learned should be null');

// Test 11: Summary with missing completed field should be saved
const missingCompletedXml = `
<summary>
<request>Refactor code</request>
<investigated>Analyzed the structure</investigated>
<learned>Found improvement opportunities</learned>
<next_steps>Continue refactoring</next_steps>
</summary>
`;

const missingCompletedResult = parseSummary(missingCompletedXml);
assert(missingCompletedResult !== null, 'Should parse summary with missing completed');
assert(missingCompletedResult!.completed === null, 'Missing completed should be null');

// Test 12: Summary with missing next_steps field should be saved
const missingNextStepsXml = `
<summary>
<request>Review code</request>
<investigated>Examined all files</investigated>
<learned>Code quality is good</learned>
<completed>Review complete</completed>
</summary>
`;

const missingNextStepsResult = parseSummary(missingNextStepsXml);
assert(missingNextStepsResult !== null, 'Should parse summary with missing next_steps');
assert(missingNextStepsResult!.next_steps === null, 'Missing next_steps should be null');

// Test 13: Summary with only notes field should be saved
const onlyNotesXml = `
<summary>
<notes>Some random notes</notes>
</summary>
`;

const onlyNotesResult = parseSummary(onlyNotesXml);
assert(onlyNotesResult !== null, 'Should parse summary with only notes field');
assertEqual(onlyNotesResult!.notes, 'Some random notes', 'Should preserve notes field');

// Test 14: Completely empty summary should be saved
const emptySummaryXml = `
<summary>
<request></request>
<investigated></investigated>
<learned></learned>
<completed></completed>
<next_steps></next_steps>
</summary>
`;

const emptySummaryResult = parseSummary(emptySummaryXml);
assert(emptySummaryResult !== null, 'Should parse completely empty summary');
assert(emptySummaryResult!.request === null, 'Empty request should be null');
assert(emptySummaryResult!.investigated === null, 'Empty investigated should be null');

// Test 15: Summary with skip_summary should return null (valid use case)
const skipSummaryXml = `
<skip_summary reason="Not enough context yet" />
`;

const skipSummaryResult = parseSummary(skipSummaryXml);
assert(skipSummaryResult === null, 'Should return null for skip_summary directive');
// ============================================================================
// Edge Cases & Data Integrity
// ============================================================================

console.log(YELLOW + '\nEdge Cases & Data Integrity' + RESET);
console.log('─'.repeat(60) + '\n');

// Test 16: Observation with whitespace-only fields should be null
const whitespaceObservationXml = `
<observation>
<type>change</type>
<title> </title>
<subtitle>

</subtitle>
<narrative></narrative>
<facts></facts>
<concepts></concepts>
<files_read></files_read>
<files_modified></files_modified>
</observation>
`;

const whitespaceResult = parseObservations(whitespaceObservationXml);
assert(whitespaceResult.length === 1, 'Should parse observation with whitespace fields');
assert(whitespaceResult[0].title === null || whitespaceResult[0].title!.trim() === '', 'Whitespace title should be null or empty');

// Test 17: Observation with concepts including type should filter out type
const conceptsWithTypeXml = `
<observation>
<type>feature</type>
<title>New feature</title>
<subtitle>Details</subtitle>
<narrative>Description</narrative>
<facts></facts>
<concepts>
<concept>feature</concept>
<concept>authentication</concept>
</concepts>
<files_read></files_read>
<files_modified></files_modified>
</observation>
`;

const conceptsWithTypeResult = parseObservations(conceptsWithTypeXml);
assert(conceptsWithTypeResult.length === 1, 'Should parse observation with type in concepts');
assertEqual(conceptsWithTypeResult[0].concepts, ['authentication'], 'Should filter out type from concepts');

// Test 18: Observation with all valid types
const validTypes = ['decision', 'bugfix', 'feature', 'refactor', 'discovery', 'change'];
validTypes.forEach(type => {
  const typeXml = `
<observation>
<type>${type}</type>
<title>Test</title>
<subtitle>Test</subtitle>
<narrative>Test</narrative>
<facts></facts>
<concepts></concepts>
<files_read></files_read>
<files_modified></files_modified>
</observation>
`;
  const result = parseObservations(typeXml);
  assertEqual(result[0].type, type, `Should accept valid type: ${type}`);
});

// ============================================================================
// Results Summary
// ============================================================================

console.log('\n' + YELLOW + '='.repeat(60) + RESET);
console.log(YELLOW + 'Test Results Summary' + RESET);
console.log(YELLOW + '='.repeat(60) + RESET + '\n');

console.log(`Total Tests: ${testsRun}`);
console.log(`${GREEN}Passed: ${testsPassed}${RESET}`);
console.log(`${RED}Failed: ${testsFailed}${RESET}`);

if (testsFailed > 0) {
  console.log(`\n${RED}❌ TESTS FAILED${RESET}\n`);
  process.exit(1);
} else {
  console.log(`\n${GREEN}✅ ALL TESTS PASSED${RESET}\n`);
  process.exit(0);
}
+9 -6
@@ -4,6 +4,7 @@
 */

import { logger } from '../utils/logger.js';
import { ModeManager } from '../services/domain/ModeManager.js';

export interface ParsedObservation {
  type: string;

@@ -51,19 +52,21 @@ export function parseObservations(text: string, correlationId?: string): ParsedO

// NOTE FROM THEDOTMACK: ALWAYS save observations - never skip. 10/24/2025
// All fields except type are nullable in schema
// If type is missing or invalid, use first type from mode as fallback

// Determine final type using active mode's valid types
const mode = ModeManager.getInstance().getActiveMode();
const validTypes = mode.observation_types.map(t => t.id);
const fallbackType = validTypes[0]; // First type in mode's list is the fallback
let finalType = fallbackType;
if (type) {
  if (validTypes.includes(type.trim())) {
    finalType = type.trim();
  } else {
    logger.warn('PARSER', `Invalid observation type: ${type}, using "${fallbackType}"`, { correlationId });
  }
} else {
  logger.warn('PARSER', `Observation missing type field, using "${fallbackType}"`, { correlationId });
}

// All other fields are optional - save whatever we have
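The hunk above makes the fallback observation type mode-driven instead of hard-coded to `'change'`. A minimal sketch of the resolution logic with a stand-in mode object (the real `ModeManager` is a singleton; the `observation_types` shape is assumed from this diff):

```typescript
interface ObservationType { id: string; }
interface Mode { observation_types: ObservationType[]; }

// Resolve the final observation type: accept the raw type if the active
// mode lists it, otherwise fall back to the first type in the mode's list.
function resolveType(rawType: string | undefined, mode: Mode): string {
  const validTypes = mode.observation_types.map(t => t.id);
  const fallbackType = validTypes[0];
  if (rawType && validTypes.includes(rawType.trim())) {
    return rawType.trim();
  }
  return fallbackType;
}
```

Putting the fallback first in each mode's type list is what lets an email-investigation mode fall back to something other than the code-centric `'change'`.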
+69 -164
@@ -4,6 +4,7 @@
 */

import { logger } from '../utils/logger.js';
import type { ModeConfig } from '../services/domain/types.js';

export interface Observation {
  id: number;

@@ -26,123 +27,63 @@ export interface SDKSession {
/**
 * Build initial prompt to initialize the SDK agent
 */
export function buildInitPrompt(project: string, sessionId: string, userPrompt: string, mode: ModeConfig): string {
  return `${mode.prompts.system_identity}

<observed_from_primary_session>
<user_request>${userPrompt}</user_request>
<requested_at>${new Date().toISOString().split('T')[0]}</requested_at>
</observed_from_primary_session>

${mode.prompts.observer_role}

${mode.prompts.spatial_awareness}

${mode.prompts.recording_focus}

${mode.prompts.skip_guidance}

${mode.prompts.output_format_header}

\`\`\`xml
<observation>
<type>[ ${mode.observation_types.map(t => t.id).join(' | ')} ]</type>
<!--
${mode.prompts.type_guidance}
-->
<title>${mode.prompts.xml_title_placeholder}</title>
<subtitle>${mode.prompts.xml_subtitle_placeholder}</subtitle>
<facts>
<fact>${mode.prompts.xml_fact_placeholder}</fact>
<fact>${mode.prompts.xml_fact_placeholder}</fact>
<fact>${mode.prompts.xml_fact_placeholder}</fact>
</facts>
<!--
${mode.prompts.field_guidance}
-->
<narrative>${mode.prompts.xml_narrative_placeholder}</narrative>
<concepts>
<concept>${mode.prompts.xml_concept_placeholder}</concept>
<concept>${mode.prompts.xml_concept_placeholder}</concept>
</concepts>
<!--
${mode.prompts.concept_guidance}
-->
<files_read>
<file>${mode.prompts.xml_file_placeholder}</file>
<file>${mode.prompts.xml_file_placeholder}</file>
</files_read>
<files_modified>
<file>${mode.prompts.xml_file_placeholder}</file>
<file>${mode.prompts.xml_file_placeholder}</file>
</files_modified>
<!--
**files**: All files touched (full paths from project root)
-->
</observation>
\`\`\`
${mode.prompts.format_examples}

${mode.prompts.footer}

Never reference yourself or your own actions. Do not output anything other than the observation content formatted in the XML structure above. All other output is ignored by the system, and the system has been designed to be smart about token usage. Please spend your tokens wisely on useful observations.

Remember that we record these observations as a way of helping us stay on track with our progress, and to help us keep important decisions and changes at the forefront of our minds! :) Thank you so much for your help!

${mode.prompts.header_memory_start}`;
}

/**
@@ -176,7 +117,7 @@ export function buildObservationPrompt(obs: Observation): string {
/**
 * Build prompt to generate progress summary
 */
export function buildSummaryPrompt(session: SDKSession, mode: ModeConfig): string {
  const lastAssistantMessage = session.last_assistant_message || logger.happyPathError(
    'SDK',
    'Missing last_assistant_message in session for summary prompt',
@@ -185,28 +126,23 @@ export function buildSummaryPrompt(session: SDKSession): string {
    ''
  );

  return `${mode.prompts.header_summary_checkpoint}
${mode.prompts.summary_instruction}

${mode.prompts.summary_context_label}
${lastAssistantMessage}

${mode.prompts.summary_format_instruction}
<summary>
<request>${mode.prompts.xml_summary_request_placeholder}</request>
<investigated>${mode.prompts.xml_summary_investigated_placeholder}</investigated>
<learned>${mode.prompts.xml_summary_learned_placeholder}</learned>
<completed>${mode.prompts.xml_summary_completed_placeholder}</completed>
<next_steps>${mode.prompts.xml_summary_next_steps_placeholder}</next_steps>
<notes>${mode.prompts.xml_summary_notes_placeholder}</notes>
</summary>

IMPORTANT! DO NOT do any work right now other than generating this next PROGRESS SUMMARY - and remember that you are a memory agent designed to summarize a DIFFERENT claude code session, not this one.

Never reference yourself or your own actions. Do not output anything other than the summary content formatted in the XML structure above. All other output is ignored by the system, and the system has been designed to be smart about token usage. Please spend your tokens wisely on useful summary content.

Thank you, this summary will be very useful for keeping track of our progress!`;
|
||||
${mode.prompts.summary_footer}`;
|
||||
}
|
||||
|
||||
/**
|
||||
@@ -230,96 +166,65 @@ Thank you, this summary will be very useful for keeping track of our progress!`;
|
||||
* Called when: promptNumber > 1 (see SDKAgent.ts line 150)
|
||||
* First prompt: Uses buildInitPrompt instead (promptNumber === 1)
|
||||
*/
|
||||
export function buildContinuationPrompt(userPrompt: string, promptNumber: number, claudeSessionId: string): string {
|
||||
return `
|
||||
Hello memory agent, you are continuing to observe the primary Claude session.
|
||||
export function buildContinuationPrompt(userPrompt: string, promptNumber: number, claudeSessionId: string, mode: ModeConfig): string {
|
||||
return `${mode.prompts.continuation_greeting}
|
||||
|
||||
<observed_from_primary_session>
|
||||
<user_request>${userPrompt}</user_request>
|
||||
<requested_at>${new Date().toISOString().split('T')[0]}</requested_at>
|
||||
</observed_from_primary_session>
|
||||
|
||||
You do not have access to tools. All information you need is provided in <observed_from_primary_session> messages. Create observations from what you observe - no investigation needed.
|
||||
${mode.prompts.system_identity}
|
||||
|
||||
CRITICAL: Record what was LEARNED/BUILT/FIXED/DEPLOYED/CONFIGURED, not what you (the observer) are doing. Focus on deliverables and capabilities - what the system NOW DOES differently.
|
||||
${mode.prompts.observer_role}
|
||||
|
||||
WHEN TO SKIP
|
||||
------------
|
||||
Skip routine operations:
|
||||
- Empty status checks
|
||||
- Package installations with no errors
|
||||
- Simple file listings
|
||||
- Repetitive operations you've already documented
|
||||
- If file related research comes back as empty or not found
|
||||
- **No output necessary if skipping.**
|
||||
${mode.prompts.spatial_awareness}
|
||||
|
||||
IMPORTANT: Continue generating observations from tool use messages using the XML structure below.
|
||||
${mode.prompts.recording_focus}
|
||||
|
||||
OUTPUT FORMAT
|
||||
-------------
|
||||
Output observations using this XML structure:
|
||||
${mode.prompts.skip_guidance}
|
||||
|
||||
${mode.prompts.continuation_instruction}
|
||||
|
||||
${mode.prompts.output_format_header}
|
||||
|
||||
\`\`\`xml
|
||||
<observation>
|
||||
<type>[ bugfix | feature | refactor | change | discovery | decision ]</type>
|
||||
<type>[ ${mode.observation_types.map(t => t.id).join(' | ')} ]</type>
|
||||
<!--
|
||||
**type**: MUST be EXACTLY one of these 6 options (no other values allowed):
|
||||
- bugfix: something was broken, now fixed
|
||||
- feature: new capability or functionality added
|
||||
- refactor: code restructured, behavior unchanged
|
||||
- change: generic modification (docs, config, misc)
|
||||
- discovery: learning about existing system
|
||||
- decision: architectural/design choice with rationale
|
||||
${mode.prompts.type_guidance}
|
||||
-->
|
||||
<title>[**title**: Short title capturing the core action or topic]</title>
|
||||
<subtitle>[**subtitle**: One sentence explanation (max 24 words)]</subtitle>
|
||||
<title>${mode.prompts.xml_title_placeholder}</title>
|
||||
<subtitle>${mode.prompts.xml_subtitle_placeholder}</subtitle>
|
||||
<facts>
|
||||
<fact>[Concise, self-contained statement]</fact>
|
||||
<fact>[Concise, self-contained statement]</fact>
|
||||
<fact>[Concise, self-contained statement]</fact>
|
||||
<fact>${mode.prompts.xml_fact_placeholder}</fact>
|
||||
<fact>${mode.prompts.xml_fact_placeholder}</fact>
|
||||
<fact>${mode.prompts.xml_fact_placeholder}</fact>
|
||||
</facts>
|
||||
<!--
|
||||
**facts**: Concise, self-contained statements
|
||||
Each fact is ONE piece of information
|
||||
No pronouns - each fact must stand alone
|
||||
Include specific details: filenames, functions, values
|
||||
${mode.prompts.field_guidance}
|
||||
-->
|
||||
<narrative>[**narrative**: Full context: What was done, how it works, why it matters]</narrative>
|
||||
<narrative>${mode.prompts.xml_narrative_placeholder}</narrative>
|
||||
<concepts>
|
||||
<concept>[knowledge-type-category]</concept>
|
||||
<concept>[knowledge-type-category]</concept>
|
||||
<concept>${mode.prompts.xml_concept_placeholder}</concept>
|
||||
<concept>${mode.prompts.xml_concept_placeholder}</concept>
|
||||
</concepts>
|
||||
<!--
|
||||
**concepts**: 2-5 knowledge-type categories. MUST use ONLY these exact keywords:
|
||||
- how-it-works: understanding mechanisms
|
||||
- why-it-exists: purpose or rationale
|
||||
- what-changed: modifications made
|
||||
- problem-solution: issues and their fixes
|
||||
- gotcha: traps or edge cases
|
||||
- pattern: reusable approach
|
||||
- trade-off: pros/cons of a decision
|
||||
|
||||
IMPORTANT: Do NOT include the observation type (change/discovery/decision) as a concept.
|
||||
Types and concepts are separate dimensions.
|
||||
${mode.prompts.concept_guidance}
|
||||
-->
|
||||
<files_read>
|
||||
<file>[path/to/file]</file>
|
||||
<file>[path/to/file]</file>
|
||||
<file>${mode.prompts.xml_file_placeholder}</file>
|
||||
<file>${mode.prompts.xml_file_placeholder}</file>
|
||||
</files_read>
|
||||
<files_modified>
|
||||
<file>[path/to/file]</file>
|
||||
<file>[path/to/file]</file>
|
||||
<file>${mode.prompts.xml_file_placeholder}</file>
|
||||
<file>${mode.prompts.xml_file_placeholder}</file>
|
||||
</files_modified>
|
||||
<!--
|
||||
**files**: All files touched (full paths from project root)
|
||||
-->
|
||||
</observation>
|
||||
\`\`\`
|
||||
${mode.prompts.format_examples}
|
||||
|
||||
Never reference yourself or your own actions. Do not output anything other than the observation content formatted in the XML structure above. All other output is ignored by the system, and the system has been designed to be smart about token usage. Please spend your tokens wisely on useful observations.
|
||||
${mode.prompts.footer}
|
||||
|
||||
Remember that we record these observations as a way of helping us stay on track with our progress, and to help us keep important decisions and changes at the forefront of our minds! :) Thank you so much for your continued help!
|
||||
|
||||
MEMORY PROCESSING CONTINUED
|
||||
===========================`;
|
||||
${mode.prompts.header_memory_continued}`;
|
||||
}
|
||||
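The prompt builders above interpolate mode-supplied strings into one template literal. A minimal standalone sketch of that pattern follows; the pared-down `MiniMode` interface and the inline demo mode are hypothetical stand-ins for the real `ModeConfig`, whose field names are taken from this diff.

```typescript
// Sketch of mode-driven prompt assembly (field names assumed from the diff;
// MiniMode and the demo values are illustrative, not the real ModeConfig).
interface MiniMode {
  observation_types: { id: string }[];
  prompts: { continuation_greeting: string; footer: string };
}

function buildMiniContinuationPrompt(userPrompt: string, mode: MiniMode): string {
  // The valid <type> list is derived from the active mode, not hardcoded.
  const typeList = mode.observation_types.map(t => t.id).join(' | ');
  return `${mode.prompts.continuation_greeting}

<user_request>${userPrompt}</user_request>
<type>[ ${typeList} ]</type>

${mode.prompts.footer}`;
}

const demoMode: MiniMode = {
  observation_types: [{ id: 'bugfix' }, { id: 'feature' }],
  prompts: {
    continuation_greeting: 'Hello memory agent.',
    footer: 'MEMORY PROCESSING CONTINUED',
  },
};

const prompt = buildMiniContinuationPrompt('fix the login bug', demoMode);
```

Swapping the mode object swaps every prompt string and the type list in one place, which is what lets a language override change wording without touching the builder.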
@@ -9,12 +9,6 @@ import path from 'path';
 import { homedir } from 'os';
 import { existsSync, readFileSync, unlinkSync } from 'fs';
 import { SessionStore } from './sqlite/SessionStore.js';
-import {
-  OBSERVATION_TYPES,
-  OBSERVATION_CONCEPTS,
-  TYPE_ICON_MAP,
-  TYPE_WORK_EMOJI_MAP
-} from '../constants/observation-metadata.js';
 import { logger } from '../utils/logger.js';
 import { SettingsDefaultsManager } from '../shared/SettingsDefaultsManager.js';
 import {
@@ -26,6 +20,7 @@ import {
   extractFirstFile
 } from '../shared/timeline-formatting.js';
 import { getProjectName } from '../utils/project-name.js';
+import { ModeManager } from './domain/ModeManager.js';

 // Version marker path - use homedir-based path that works in both CJS and ESM contexts
 const VERSION_MARKER_PATH = path.join(homedir(), '.claude', 'plugins', 'marketplaces', 'thedotmack', 'plugin', '.install-version');
@@ -60,43 +55,24 @@ function loadContextConfig(): ContextConfig {
   const settingsPath = path.join(homedir(), '.claude-mem', 'settings.json');
   const settings = SettingsDefaultsManager.loadFromFile(settingsPath);

-  try {
-    return {
-      totalObservationCount: parseInt(settings.CLAUDE_MEM_CONTEXT_OBSERVATIONS, 10),
-      fullObservationCount: parseInt(settings.CLAUDE_MEM_CONTEXT_FULL_COUNT, 10),
-      sessionCount: parseInt(settings.CLAUDE_MEM_CONTEXT_SESSION_COUNT, 10),
-      showReadTokens: settings.CLAUDE_MEM_CONTEXT_SHOW_READ_TOKENS === 'true',
-      showWorkTokens: settings.CLAUDE_MEM_CONTEXT_SHOW_WORK_TOKENS === 'true',
-      showSavingsAmount: settings.CLAUDE_MEM_CONTEXT_SHOW_SAVINGS_AMOUNT === 'true',
-      showSavingsPercent: settings.CLAUDE_MEM_CONTEXT_SHOW_SAVINGS_PERCENT === 'true',
-      observationTypes: new Set(
-        settings.CLAUDE_MEM_CONTEXT_OBSERVATION_TYPES.split(',').map((t: string) => t.trim()).filter(Boolean)
-      ),
-      observationConcepts: new Set(
-        settings.CLAUDE_MEM_CONTEXT_OBSERVATION_CONCEPTS.split(',').map((c: string) => c.trim()).filter(Boolean)
-      ),
-      fullObservationField: settings.CLAUDE_MEM_CONTEXT_FULL_FIELD as 'narrative' | 'facts',
-      showLastSummary: settings.CLAUDE_MEM_CONTEXT_SHOW_LAST_SUMMARY === 'true',
-      showLastMessage: settings.CLAUDE_MEM_CONTEXT_SHOW_LAST_MESSAGE === 'true',
-    };
-  } catch (error) {
-    logger.warn('WORKER', 'Failed to load context settings, using defaults', {}, error as Error);
-    // Return defaults on error
-    return {
-      totalObservationCount: 50,
-      fullObservationCount: 5,
-      sessionCount: 10,
-      showReadTokens: true,
-      showWorkTokens: true,
-      showSavingsAmount: true,
-      showSavingsPercent: true,
-      observationTypes: new Set(OBSERVATION_TYPES),
-      observationConcepts: new Set(OBSERVATION_CONCEPTS),
-      fullObservationField: 'narrative' as const,
-      showLastSummary: true,
-      showLastMessage: false,
-    };
-  }
+  return {
+    totalObservationCount: parseInt(settings.CLAUDE_MEM_CONTEXT_OBSERVATIONS, 10),
+    fullObservationCount: parseInt(settings.CLAUDE_MEM_CONTEXT_FULL_COUNT, 10),
+    sessionCount: parseInt(settings.CLAUDE_MEM_CONTEXT_SESSION_COUNT, 10),
+    showReadTokens: settings.CLAUDE_MEM_CONTEXT_SHOW_READ_TOKENS === 'true',
+    showWorkTokens: settings.CLAUDE_MEM_CONTEXT_SHOW_WORK_TOKENS === 'true',
+    showSavingsAmount: settings.CLAUDE_MEM_CONTEXT_SHOW_SAVINGS_AMOUNT === 'true',
+    showSavingsPercent: settings.CLAUDE_MEM_CONTEXT_SHOW_SAVINGS_PERCENT === 'true',
+    observationTypes: new Set(
+      settings.CLAUDE_MEM_CONTEXT_OBSERVATION_TYPES.split(',').map((t: string) => t.trim()).filter(Boolean)
+    ),
+    observationConcepts: new Set(
+      settings.CLAUDE_MEM_CONTEXT_OBSERVATION_CONCEPTS.split(',').map((c: string) => c.trim()).filter(Boolean)
+    ),
+    fullObservationField: settings.CLAUDE_MEM_CONTEXT_FULL_FIELD as 'narrative' | 'facts',
+    showLastSummary: settings.CLAUDE_MEM_CONTEXT_SHOW_LAST_SUMMARY === 'true',
+    showLastMessage: settings.CLAUDE_MEM_CONTEXT_SHOW_LAST_MESSAGE === 'true',
+  };
 }

 // Configuration constants
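The config loader above turns comma-separated settings strings into `Set`s via split, trim, and filter. A small sketch of that parsing step, with an invented helper name:

```typescript
// Sketch of the CSV-to-Set parsing used by loadContextConfig above.
// parseCsvSetting is a hypothetical helper name; the split/trim/filter
// pipeline mirrors the diff. filter(Boolean) drops empty entries left
// by trailing or doubled commas.
function parseCsvSetting(raw: string): Set<string> {
  return new Set(raw.split(',').map(s => s.trim()).filter(Boolean));
}

// Messy input: extra spaces, a doubled comma, a trailing space.
const types = parseCsvSetting('bugfix, feature,, decision ');
```

Using a `Set` also deduplicates repeated entries for free, which matters when users hand-edit `settings.json`.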
@@ -280,20 +256,16 @@ export async function generateContext(input?: ContextInput, useColors: boolean =
   let priorAssistantMessage = '';

   if (config.showLastMessage && observations.length > 0) {
-    try {
-      const currentSessionId = input?.session_id;
-      const priorSessionObs = observations.find(obs => obs.sdk_session_id !== currentSessionId);
+    const currentSessionId = input?.session_id;
+    const priorSessionObs = observations.find(obs => obs.sdk_session_id !== currentSessionId);

-      if (priorSessionObs) {
-        const priorSessionId = priorSessionObs.sdk_session_id;
-        const dashedCwd = cwdToDashed(cwd);
-        const transcriptPath = path.join(homedir(), '.claude', 'projects', dashedCwd, `${priorSessionId}.jsonl`);
-        const messages = extractPriorMessages(transcriptPath);
-        priorUserMessage = messages.userMessage;
-        priorAssistantMessage = messages.assistantMessage;
-      }
-    } catch (error) {
-      // Expected: Transcript file may not exist or be readable
-    }
+    if (priorSessionObs) {
+      const priorSessionId = priorSessionObs.sdk_session_id;
+      const dashedCwd = cwdToDashed(cwd);
+      const transcriptPath = path.join(homedir(), '.claude', 'projects', dashedCwd, `${priorSessionId}.jsonl`);
+      const messages = extractPriorMessages(transcriptPath);
+      priorUserMessage = messages.userMessage;
+      priorAssistantMessage = messages.assistantMessage;
+    }
   }
@@ -325,11 +297,13 @@ export async function generateContext(input?: ContextInput, useColors: boolean =

   // Chronological Timeline
   if (timelineObs.length > 0) {
-    // Legend
+    // Legend - generate dynamically from active mode
+    const mode = ModeManager.getInstance().getActiveMode();
+    const typeLegendItems = mode.observation_types.map(t => `${t.emoji} ${t.id}`).join(' | ');
     if (useColors) {
-      output.push(`${colors.dim}Legend: 🎯 session-request | 🔴 bugfix | 🟣 feature | 🔄 refactor | ✅ change | 🔵 discovery | ⚖️ decision${colors.reset}`);
+      output.push(`${colors.dim}Legend: 🎯 session-request | ${typeLegendItems}${colors.reset}`);
     } else {
-      output.push(`**Legend:** 🎯 session-request | 🔴 bugfix | 🟣 feature | 🔄 refactor | ✅ change | 🔵 discovery | ⚖️ decision`);
+      output.push(`**Legend:** 🎯 session-request | ${typeLegendItems}`);
     }
     output.push('');
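The hunk above replaces a hardcoded legend with one derived from the active mode's `observation_types`. Isolated as a sketch (the `TypeEntry` shape and `buildLegend` helper are illustrative; the real code inlines this in `generateContext`):

```typescript
// Sketch of the dynamic legend generation above. TypeEntry mirrors the
// id/emoji fields of ObservationType from the diff; buildLegend is a
// hypothetical helper, inlined in the real code.
interface TypeEntry { id: string; emoji: string }

function buildLegend(types: TypeEntry[]): string {
  const items = types.map(t => `${t.emoji} ${t.id}`).join(' | ');
  // The session-request marker is mode-independent, so it stays hardcoded.
  return `Legend: 🎯 session-request | ${items}`;
}

const legend = buildLegend([
  { id: 'bugfix', emoji: '🔴' },
  { id: 'feature', emoji: '🟣' },
]);
```

Because the legend now reads from the mode, a mode with different types (say, email investigation) renders its own icons with no change to the timeline code.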
@@ -536,7 +510,7 @@ export async function generateContext(input?: ContextInput, useColors: boolean =

     const time = formatTime(obs.created_at);
     const title = obs.title || 'Untitled';
-    const icon = TYPE_ICON_MAP[obs.type as keyof typeof TYPE_ICON_MAP] || '•';
+    const icon = ModeManager.getInstance().getTypeIcon(obs.type);

     const obsSize = (obs.title?.length || 0) +
       (obs.subtitle?.length || 0) +
@@ -544,7 +518,7 @@ export async function generateContext(input?: ContextInput, useColors: boolean =
       JSON.stringify(obs.facts || []).length;
     const readTokens = Math.ceil(obsSize / CHARS_PER_TOKEN_ESTIMATE);
     const discoveryTokens = obs.discovery_tokens || 0;
-    const workEmoji = TYPE_WORK_EMOJI_MAP[obs.type as keyof typeof TYPE_WORK_EMOJI_MAP] || '🔍';
+    const workEmoji = ModeManager.getInstance().getWorkEmoji(obs.type);
     const discoveryDisplay = discoveryTokens > 0 ? `${workEmoji} ${discoveryTokens.toLocaleString()}` : '-';

     const showTime = time !== lastTime;

@@ -0,0 +1,254 @@
/**
 * ModeManager - Singleton for loading and managing mode profiles
 *
 * Mode profiles define observation types, concepts, and prompts for different use cases.
 * Default mode is 'code' (software development). Other modes like 'email-investigation'
 * can be selected via the CLAUDE_MEM_MODE setting.
 */

import { readFileSync, existsSync } from 'fs';
import { join } from 'path';
import type { ModeConfig, ObservationType, ObservationConcept } from './types.js';
import { logger } from '../../utils/logger.js';
import { getPackageRoot } from '../../shared/paths.js';

export class ModeManager {
  private static instance: ModeManager | null = null;
  private activeMode: ModeConfig | null = null;
  private modesDir: string;

  private constructor() {
    // Modes live in plugin/modes/.
    // getPackageRoot() points to plugin/ in production and src/ in development,
    // so check both candidate locations for the modes directory.
    const packageRoot = getPackageRoot();

    const possiblePaths = [
      join(packageRoot, 'modes'),                 // Production (plugin/modes)
      join(packageRoot, '..', 'plugin', 'modes'), // Development (src/../plugin/modes)
    ];

    const foundPath = possiblePaths.find(p => existsSync(p));
    this.modesDir = foundPath || possiblePaths[0];
  }

  /**
   * Get singleton instance
   */
  static getInstance(): ModeManager {
    if (!ModeManager.instance) {
      ModeManager.instance = new ModeManager();
    }
    return ModeManager.instance;
  }

  /**
   * Parse mode ID for inheritance pattern (parent--override)
   */
  private parseInheritance(modeId: string): {
    hasParent: boolean;
    parentId: string;
    overrideId: string;
  } {
    const parts = modeId.split('--');

    if (parts.length === 1) {
      return { hasParent: false, parentId: '', overrideId: '' };
    }

    // Support only one level: code--ko, not code--ko--verbose
    if (parts.length > 2) {
      throw new Error(
        `Invalid mode inheritance: ${modeId}. Only one level of inheritance is supported (parent--override)`
      );
    }

    return {
      hasParent: true,
      parentId: parts[0],
      overrideId: modeId // Use the full modeId (e.g., code--es) to find the override file
    };
  }

  /**
   * Check if value is a plain object (not array, not null)
   */
  private isPlainObject(value: unknown): boolean {
    return (
      value !== null &&
      typeof value === 'object' &&
      !Array.isArray(value)
    );
  }

  /**
   * Deep merge two objects
   * - Recursively merge nested objects
   * - Replace arrays completely (no merging)
   * - Override primitives
   */
  private deepMerge<T>(base: T, override: Partial<T>): T {
    const result = { ...base } as T;

    for (const key in override) {
      const overrideValue = override[key];
      const baseValue = base[key];

      if (this.isPlainObject(overrideValue) && this.isPlainObject(baseValue)) {
        // Recursively merge nested objects
        result[key] = this.deepMerge(baseValue, overrideValue as any);
      } else {
        // Replace arrays and primitives completely
        result[key] = overrideValue as T[Extract<keyof T, string>];
      }
    }

    return result;
  }

  /**
   * Load a mode file from disk without inheritance processing
   */
  private loadModeFile(modeId: string): ModeConfig {
    const modePath = join(this.modesDir, `${modeId}.json`);

    if (!existsSync(modePath)) {
      throw new Error(`Mode file not found: ${modePath}`);
    }

    const jsonContent = readFileSync(modePath, 'utf-8');
    return JSON.parse(jsonContent) as ModeConfig;
  }

  /**
   * Load a mode profile by ID with inheritance support
   * Caches the result for subsequent calls
   *
   * Supports inheritance via the parent--override pattern (e.g., code--ko)
   * - Loads parent mode recursively
   * - Loads override file from modes directory
   * - Deep merges override onto parent
   */
  loadMode(modeId: string): ModeConfig {
    const inheritance = this.parseInheritance(modeId);

    // No inheritance - load file directly (existing behavior)
    if (!inheritance.hasParent) {
      try {
        const mode = this.loadModeFile(modeId);
        this.activeMode = mode;
        logger.debug('SYSTEM', `Loaded mode: ${mode.name} (${modeId})`, undefined, {
          types: mode.observation_types.map(t => t.id),
          concepts: mode.observation_concepts.map(c => c.id)
        });
        return mode;
      } catch (error) {
        logger.warn('SYSTEM', `Mode file not found: ${modeId}, falling back to 'code'`);
        // If we're already trying to load 'code', throw to prevent infinite recursion
        if (modeId === 'code') {
          throw new Error('Critical: code.json mode file missing');
        }
        return this.loadMode('code');
      }
    }

    // Has inheritance - load parent and merge with override
    const { parentId, overrideId } = inheritance;

    // Load parent mode recursively
    let parentMode: ModeConfig;
    try {
      parentMode = this.loadMode(parentId);
    } catch (error) {
      logger.warn('SYSTEM', `Parent mode '${parentId}' not found for ${modeId}, falling back to 'code'`);
      parentMode = this.loadMode('code');
    }

    // Load override file
    let overrideConfig: Partial<ModeConfig>;
    try {
      overrideConfig = this.loadModeFile(overrideId);
      logger.debug('SYSTEM', `Loaded override file: ${overrideId} for parent ${parentId}`);
    } catch (error) {
      logger.warn('SYSTEM', `Override file '${overrideId}' not found, using parent mode '${parentId}' only`);
      this.activeMode = parentMode;
      return parentMode;
    }

    // Validate override file loaded successfully
    if (!overrideConfig) {
      logger.warn('SYSTEM', `Invalid override file: ${overrideId}, using parent mode '${parentId}' only`);
      this.activeMode = parentMode;
      return parentMode;
    }

    // Deep merge override onto parent
    const mergedMode = this.deepMerge(parentMode, overrideConfig);
    this.activeMode = mergedMode;

    logger.debug('SYSTEM', `Loaded mode with inheritance: ${mergedMode.name} (${modeId} = ${parentId} + ${overrideId})`, undefined, {
      parent: parentId,
      override: overrideId,
      types: mergedMode.observation_types.map(t => t.id),
      concepts: mergedMode.observation_concepts.map(c => c.id)
    });

    return mergedMode;
  }

  /**
   * Get currently active mode
   */
  getActiveMode(): ModeConfig {
    if (!this.activeMode) {
      throw new Error('No mode loaded. Call loadMode() first.');
    }
    return this.activeMode;
  }

  /**
   * Get all observation types from the active mode
   */
  getObservationTypes(): ObservationType[] {
    return this.getActiveMode().observation_types;
  }

  /**
   * Get all observation concepts from the active mode
   */
  getObservationConcepts(): ObservationConcept[] {
    return this.getActiveMode().observation_concepts;
  }

  /**
   * Get icon for a specific observation type
   */
  getTypeIcon(typeId: string): string {
    const type = this.getObservationTypes().find(t => t.id === typeId);
    return type?.emoji || '📝';
  }

  /**
   * Get work emoji for a specific observation type
   */
  getWorkEmoji(typeId: string): string {
    const type = this.getObservationTypes().find(t => t.id === typeId);
    return type?.work_emoji || '📝';
  }

  /**
   * Validate that a type ID exists in the active mode
   */
  validateType(typeId: string): boolean {
    return this.getObservationTypes().some(t => t.id === typeId);
  }

  /**
   * Get label for a specific observation type
   */
  getTypeLabel(typeId: string): string {
    const type = this.getObservationTypes().find(t => t.id === typeId);
    return type?.label || typeId;
  }
}
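The two core behaviors of `ModeManager` inheritance, ID parsing and deep merge, can be exercised in isolation. The sketch below re-implements them standalone (names `parseModeId` and `deepMerge` here are illustrative; they mirror but are not the private class methods above): `"code--ko"` resolves to parent `"code"` plus override file `"code--ko"`, and overrides deep-merge onto the parent with arrays replaced wholesale.

```typescript
// Standalone sketch of the ModeManager inheritance rules above.
function parseModeId(modeId: string): { parentId: string | null; overrideId: string } {
  const parts = modeId.split('--');
  if (parts.length > 2) throw new Error(`Only one inheritance level allowed: ${modeId}`);
  return parts.length === 1
    ? { parentId: null, overrideId: modeId }
    : { parentId: parts[0], overrideId: modeId }; // full ID names the override file
}

function deepMerge<T extends Record<string, any>>(base: T, override: Partial<T>): T {
  const out: any = { ...base };
  for (const key of Object.keys(override)) {
    const o = (override as any)[key];
    const b = (base as any)[key];
    const bothObjects =
      o !== null && typeof o === 'object' && !Array.isArray(o) &&
      b !== null && typeof b === 'object' && !Array.isArray(b);
    // Nested objects merge recursively; arrays and primitives replace.
    out[key] = bothObjects ? deepMerge(b, o) : o;
  }
  return out;
}

const parsed = parseModeId('code--ko');
const merged = deepMerge(
  { name: 'code', prompts: { footer: 'en footer', header: 'H' }, types: ['a', 'b'] },
  { prompts: { footer: 'ko footer' }, types: ['c'] } as any
);
```

Replacing arrays instead of concatenating them is deliberate: an override that redefines `observation_types` should fully own the list, not append translated duplicates to the parent's.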
@@ -0,0 +1,72 @@
/**
 * TypeScript interfaces for the mode configuration system
 */

export interface ObservationType {
  id: string;
  label: string;
  description: string;
  emoji: string;
  work_emoji: string;
}

export interface ObservationConcept {
  id: string;
  label: string;
  description: string;
}

export interface ModePrompts {
  system_identity: string;           // Base persona and role definition
  language_instruction?: string;     // Optional language constraints (e.g., "Write in Korean")
  spatial_awareness: string;         // Working directory context guidance
  observer_role: string;             // What the observer's job is in this mode
  recording_focus: string;           // What to record and how to think about it
  skip_guidance: string;             // What to skip recording
  type_guidance: string;             // Valid observation types for this mode
  concept_guidance: string;          // Valid concept categories for this mode
  field_guidance: string;            // Guidance for facts/files fields
  output_format_header: string;      // Text introducing the XML schema
  format_examples: string;           // Optional additional XML examples (empty string if not needed)
  footer: string;                    // Closing instructions and encouragement

  // Observation XML placeholders
  xml_title_placeholder: string;     // e.g., "[**title**: Short title capturing the core action or topic]"
  xml_subtitle_placeholder: string;  // e.g., "[**subtitle**: One sentence explanation (max 24 words)]"
  xml_fact_placeholder: string;      // e.g., "[Concise, self-contained statement]"
  xml_narrative_placeholder: string; // e.g., "[**narrative**: Full context: What was done, how it works, why it matters]"
  xml_concept_placeholder: string;   // e.g., "[knowledge-type-category]"
  xml_file_placeholder: string;      // e.g., "[path/to/file]"

  // Summary XML placeholders
  xml_summary_request_placeholder: string;      // e.g., "[Short title capturing the user's request AND...]"
  xml_summary_investigated_placeholder: string; // e.g., "[What has been explored so far? What was examined?]"
  xml_summary_learned_placeholder: string;      // e.g., "[What have you learned about how things work?]"
  xml_summary_completed_placeholder: string;    // e.g., "[What work has been completed so far? What has shipped or changed?]"
  xml_summary_next_steps_placeholder: string;   // e.g., "[What are you actively working on or planning to work on next in this session?]"
  xml_summary_notes_placeholder: string;        // e.g., "[Additional insights or observations about the current progress]"

  // Section headers (with separator lines)
  header_memory_start: string;       // e.g., "MEMORY PROCESSING START\n======================="
  header_memory_continued: string;   // e.g., "MEMORY PROCESSING CONTINUED\n==========================="
  header_summary_checkpoint: string; // e.g., "PROGRESS SUMMARY CHECKPOINT\n==========================="

  // Continuation prompts
  continuation_greeting: string;     // e.g., "Hello memory agent, you are continuing to observe the primary Claude session."
  continuation_instruction: string;  // e.g., "IMPORTANT: Continue generating observations from tool use messages using the XML structure below."

  // Summary prompts
  summary_instruction: string;       // Instructions for writing the progress summary
  summary_context_label: string;     // Label for Claude's response section (e.g., "Claude's Full Response to User:")
  summary_format_instruction: string; // Instruction to use XML format (e.g., "Respond in this XML format:")
  summary_footer: string;            // Footer with closing instructions and language requirement
}

export interface ModeConfig {
  name: string;
  description: string;
  version: string;
  observation_types: ObservationType[];
  observation_concepts: ObservationConcept[];
  prompts: ModePrompts;
}
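Given the interfaces above and the parent--override inheritance, a language-variant mode file only needs to carry the keys it changes. The fragment below is an invented illustration of what a `code--ko.json` override might contain (the concrete strings are examples, not shipped content); everything omitted inherits from the parent `code` mode via deep merge.

```typescript
// Illustrative partial override, typed loosely as Partial<ModeConfig>-shaped
// data. Field names come from the ModePrompts interface above; the values
// are invented examples.
const koOverride = {
  name: 'Code (Korean)',
  prompts: {
    language_instruction: 'Write all observations and summaries in Korean.',
    continuation_greeting: '안녕하세요, 메모리 에이전트입니다.',
  },
};
```

Keeping override files sparse is what makes multilingual variants cheap to maintain: a wording fix in the parent `code` mode flows into every language override automatically.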
@@ -50,104 +50,100 @@ export class SessionSearch {
   * TODO: Remove FTS5 infrastructure in future major version (v7.0.0)
   */
  private ensureFTSTables(): void {
-    try {
-      // Check if FTS tables already exist
-      const tables = this.db.prepare("SELECT name FROM sqlite_master WHERE type='table' AND name LIKE '%_fts'").all() as TableNameRow[];
-      const hasFTS = tables.some(t => t.name === 'observations_fts' || t.name === 'session_summaries_fts');
+    // Check if FTS tables already exist
+    const tables = this.db.prepare("SELECT name FROM sqlite_master WHERE type='table' AND name LIKE '%_fts'").all() as TableNameRow[];
+    const hasFTS = tables.some(t => t.name === 'observations_fts' || t.name === 'session_summaries_fts');

     if (hasFTS) {
       // Already migrated
       return;
     }

     console.log('[SessionSearch] Creating FTS5 tables...');

     // Create observations_fts virtual table
     this.db.run(`
       CREATE VIRTUAL TABLE IF NOT EXISTS observations_fts USING fts5(
         title,
         subtitle,
         narrative,
         text,
         facts,
         concepts,
         content='observations',
         content_rowid='id'
       );
     `);

     // Populate with existing data
     this.db.run(`
       INSERT INTO observations_fts(rowid, title, subtitle, narrative, text, facts, concepts)
       SELECT id, title, subtitle, narrative, text, facts, concepts
       FROM observations;
     `);

     // Create triggers for observations
     this.db.run(`
       CREATE TRIGGER IF NOT EXISTS observations_ai AFTER INSERT ON observations BEGIN
         INSERT INTO observations_fts(rowid, title, subtitle, narrative, text, facts, concepts)
         VALUES (new.id, new.title, new.subtitle, new.narrative, new.text, new.facts, new.concepts);
       END;

       CREATE TRIGGER IF NOT EXISTS observations_ad AFTER DELETE ON observations BEGIN
         INSERT INTO observations_fts(observations_fts, rowid, title, subtitle, narrative, text, facts, concepts)
         VALUES('delete', old.id, old.title, old.subtitle, old.narrative, old.text, old.facts, old.concepts);
       END;

       CREATE TRIGGER IF NOT EXISTS observations_au AFTER UPDATE ON observations BEGIN
         INSERT INTO observations_fts(observations_fts, rowid, title, subtitle, narrative, text, facts, concepts)
         VALUES('delete', old.id, old.title, old.subtitle, old.narrative, old.text, old.facts, old.concepts);
         INSERT INTO observations_fts(rowid, title, subtitle, narrative, text, facts, concepts)
         VALUES (new.id, new.title, new.subtitle, new.narrative, new.text, new.facts, new.concepts);
       END;
     `);

     // Create session_summaries_fts virtual table
     this.db.run(`
       CREATE VIRTUAL TABLE IF NOT EXISTS session_summaries_fts USING fts5(
         request,
         investigated,
         learned,
         completed,
         next_steps,
         notes,
         content='session_summaries',
         content_rowid='id'
       );
     `);

     // Populate with existing data
     this.db.run(`
       INSERT INTO session_summaries_fts(rowid, request, investigated, learned, completed, next_steps, notes)
       SELECT id, request, investigated, learned, completed, next_steps, notes
       FROM session_summaries;
     `);

     // Create triggers for session_summaries
     this.db.run(`
       CREATE TRIGGER IF NOT EXISTS session_summaries_ai AFTER INSERT ON session_summaries BEGIN
         INSERT INTO session_summaries_fts(rowid, request, investigated, learned, completed, next_steps, notes)
         VALUES (new.id, new.request, new.investigated, new.learned, new.completed, new.next_steps, new.notes);
       END;

       CREATE TRIGGER IF NOT EXISTS session_summaries_ad AFTER DELETE ON session_summaries BEGIN
         INSERT INTO session_summaries_fts(session_summaries_fts, rowid, request, investigated, learned, completed, next_steps, notes)
         VALUES('delete', old.id, old.request, old.investigated, old.learned, old.completed, old.next_steps, old.notes);
       END;

       CREATE TRIGGER IF NOT EXISTS session_summaries_au AFTER UPDATE ON session_summaries BEGIN
         INSERT INTO session_summaries_fts(session_summaries_fts, rowid, request, investigated, learned, completed, next_steps, notes)
         VALUES('delete', old.id, old.request, old.investigated, old.learned, old.completed, old.next_steps, old.notes);
         INSERT INTO session_summaries_fts(rowid, request, investigated, learned, completed, next_steps, notes)
         VALUES (new.id, new.request, new.investigated, new.learned, new.completed, new.next_steps, new.notes);
       END;
     `);

     console.log('[SessionSearch] FTS5 tables created successfully');
-    } catch (error: any) {
-      console.error('[SessionSearch] FTS migration error:', error.message);
-    }
|
||||
|
||||
// Create observations_fts virtual table
|
||||
this.db.run(`
|
||||
CREATE VIRTUAL TABLE IF NOT EXISTS observations_fts USING fts5(
|
||||
title,
|
||||
subtitle,
|
||||
narrative,
|
||||
text,
|
||||
facts,
|
||||
concepts,
|
||||
content='observations',
|
||||
content_rowid='id'
|
||||
);
|
||||
`);
|
||||
|
||||
// Populate with existing data
|
||||
this.db.run(`
|
||||
INSERT INTO observations_fts(rowid, title, subtitle, narrative, text, facts, concepts)
|
||||
SELECT id, title, subtitle, narrative, text, facts, concepts
|
||||
FROM observations;
|
||||
`);
|
||||
|
||||
// Create triggers for observations
|
||||
this.db.run(`
|
||||
CREATE TRIGGER IF NOT EXISTS observations_ai AFTER INSERT ON observations BEGIN
|
||||
INSERT INTO observations_fts(rowid, title, subtitle, narrative, text, facts, concepts)
|
||||
VALUES (new.id, new.title, new.subtitle, new.narrative, new.text, new.facts, new.concepts);
|
||||
END;
|
||||
|
||||
CREATE TRIGGER IF NOT EXISTS observations_ad AFTER DELETE ON observations BEGIN
|
||||
INSERT INTO observations_fts(observations_fts, rowid, title, subtitle, narrative, text, facts, concepts)
|
||||
VALUES('delete', old.id, old.title, old.subtitle, old.narrative, old.text, old.facts, old.concepts);
|
||||
END;
|
||||
|
||||
CREATE TRIGGER IF NOT EXISTS observations_au AFTER UPDATE ON observations BEGIN
|
||||
INSERT INTO observations_fts(observations_fts, rowid, title, subtitle, narrative, text, facts, concepts)
|
||||
VALUES('delete', old.id, old.title, old.subtitle, old.narrative, old.text, old.facts, old.concepts);
|
||||
INSERT INTO observations_fts(rowid, title, subtitle, narrative, text, facts, concepts)
|
||||
VALUES (new.id, new.title, new.subtitle, new.narrative, new.text, new.facts, new.concepts);
|
||||
END;
|
||||
`);
|
||||
|
||||
// Create session_summaries_fts virtual table
|
||||
this.db.run(`
|
||||
CREATE VIRTUAL TABLE IF NOT EXISTS session_summaries_fts USING fts5(
|
||||
request,
|
||||
investigated,
|
||||
learned,
|
||||
completed,
|
||||
next_steps,
|
||||
notes,
|
||||
content='session_summaries',
|
||||
content_rowid='id'
|
||||
);
|
||||
`);
|
||||
|
||||
// Populate with existing data
|
||||
this.db.run(`
|
||||
INSERT INTO session_summaries_fts(rowid, request, investigated, learned, completed, next_steps, notes)
|
||||
SELECT id, request, investigated, learned, completed, next_steps, notes
|
||||
FROM session_summaries;
|
||||
`);
|
||||
|
||||
// Create triggers for session_summaries
|
||||
this.db.run(`
|
||||
CREATE TRIGGER IF NOT EXISTS session_summaries_ai AFTER INSERT ON session_summaries BEGIN
|
||||
INSERT INTO session_summaries_fts(rowid, request, investigated, learned, completed, next_steps, notes)
|
||||
VALUES (new.id, new.request, new.investigated, new.learned, new.completed, new.next_steps, new.notes);
|
||||
END;
|
||||
|
||||
CREATE TRIGGER IF NOT EXISTS session_summaries_ad AFTER DELETE ON session_summaries BEGIN
|
||||
INSERT INTO session_summaries_fts(session_summaries_fts, rowid, request, investigated, learned, completed, next_steps, notes)
|
||||
VALUES('delete', old.id, old.request, old.investigated, old.learned, old.completed, old.next_steps, old.notes);
|
||||
END;
|
||||
|
||||
CREATE TRIGGER IF NOT EXISTS session_summaries_au AFTER UPDATE ON session_summaries BEGIN
|
||||
INSERT INTO session_summaries_fts(session_summaries_fts, rowid, request, investigated, learned, completed, next_steps, notes)
|
||||
VALUES('delete', old.id, old.request, old.investigated, old.learned, old.completed, old.next_steps, old.notes);
|
||||
INSERT INTO session_summaries_fts(rowid, request, investigated, learned, completed, next_steps, notes)
|
||||
VALUES (new.id, new.request, new.investigated, new.learned, new.completed, new.next_steps, new.notes);
|
||||
END;
|
||||
`);
|
||||
|
||||
console.log('[SessionSearch] FTS5 tables created successfully');
|
||||
}
|
||||
|
||||
|
||||
|
||||
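The FTS5 tables above use the external-content pattern (`content='observations'`, `content_rowid='id'`): the index stores no copy of the row text, so the AFTER INSERT/DELETE/UPDATE triggers must keep it in sync, and removals go through the special 'delete' command rather than a plain DELETE. A minimal sketch of the same pattern, using Python's `sqlite3` purely as an illustrative harness (table and column names simplified from the migration above):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE observations(id INTEGER PRIMARY KEY, title TEXT);
    -- External-content FTS table: indexes observations without copying its text
    CREATE VIRTUAL TABLE observations_fts USING fts5(
        title, content='observations', content_rowid='id');
    CREATE TRIGGER observations_ai AFTER INSERT ON observations BEGIN
        INSERT INTO observations_fts(rowid, title) VALUES (new.id, new.title);
    END;
    CREATE TRIGGER observations_ad AFTER DELETE ON observations BEGIN
        -- the 'delete' command removes the old row from the index
        INSERT INTO observations_fts(observations_fts, rowid, title)
        VALUES ('delete', old.id, old.title);
    END;
""")
conn.execute("INSERT INTO observations(title) VALUES ('fix login bug')")
conn.execute("INSERT INTO observations(title) VALUES ('add search')")
conn.execute("DELETE FROM observations WHERE title = 'add search'")

# The deleted row is gone from the index; the other row is still searchable
rows = conn.execute(
    "SELECT rowid FROM observations_fts WHERE observations_fts MATCH 'login'"
).fetchall()
print(rows)
```

Without the 'delete' command in the AD/AU triggers, the external-content index would keep stale entries for rows that no longer exist.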
@@ -144,152 +144,140 @@ export class SessionStore {
  /**
   * Ensure worker_port column exists (migration 5)
   */
  private ensureWorkerPortColumn(): void {
    // Check if migration already applied
    const applied = this.db.prepare('SELECT version FROM schema_versions WHERE version = ?').get(5) as SchemaVersion | undefined;
    if (applied) return;

    // Check if column exists
    const tableInfo = this.db.query('PRAGMA table_info(sdk_sessions)').all() as TableColumnInfo[];
    const hasWorkerPort = tableInfo.some(col => col.name === 'worker_port');

    if (!hasWorkerPort) {
      this.db.run('ALTER TABLE sdk_sessions ADD COLUMN worker_port INTEGER');
      console.log('[SessionStore] Added worker_port column to sdk_sessions table');
    }

    // Record migration
    this.db.prepare('INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)').run(5, new Date().toISOString());
  }
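Migration 5 above is gated twice: on a version row in `schema_versions`, and on `PRAGMA table_info` in case the column exists without a version row, so re-running it is a no-op. The same idempotency pattern, sketched in Python's `sqlite3` with a simplified schema (names mirror the migration above but the harness is illustrative):

```python
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE schema_versions(version INTEGER PRIMARY KEY, applied_at TEXT)")
conn.execute("CREATE TABLE sdk_sessions(id INTEGER PRIMARY KEY)")

def ensure_worker_port_column(conn):
    # Gate 1: skip if this migration version was already recorded
    if conn.execute("SELECT version FROM schema_versions WHERE version = 5").fetchone():
        return "skipped"
    # Gate 2: guard against the column existing even without a version row
    cols = [row[1] for row in conn.execute("PRAGMA table_info(sdk_sessions)")]
    if "worker_port" not in cols:
        conn.execute("ALTER TABLE sdk_sessions ADD COLUMN worker_port INTEGER")
    # Record the migration so future runs hit gate 1
    conn.execute(
        "INSERT OR IGNORE INTO schema_versions(version, applied_at) VALUES (5, ?)",
        (datetime.now(timezone.utc).isoformat(),),
    )
    return "applied"

first = ensure_worker_port_column(conn)
second = ensure_worker_port_column(conn)
print(first, second)  # applied skipped
```

The second gate matters when a database was altered out-of-band (or by an older build) without the version row being written.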
  /**
   * Ensure prompt tracking columns exist (migration 6)
   */
  private ensurePromptTrackingColumns(): void {
    // Check if migration already applied
    const applied = this.db.prepare('SELECT version FROM schema_versions WHERE version = ?').get(6) as SchemaVersion | undefined;
    if (applied) return;

    // Check sdk_sessions for prompt_counter
    const sessionsInfo = this.db.query('PRAGMA table_info(sdk_sessions)').all() as TableColumnInfo[];
    const hasPromptCounter = sessionsInfo.some(col => col.name === 'prompt_counter');

    if (!hasPromptCounter) {
      this.db.run('ALTER TABLE sdk_sessions ADD COLUMN prompt_counter INTEGER DEFAULT 0');
      console.log('[SessionStore] Added prompt_counter column to sdk_sessions table');
    }

    // Check observations for prompt_number
    const observationsInfo = this.db.query('PRAGMA table_info(observations)').all() as TableColumnInfo[];
    const obsHasPromptNumber = observationsInfo.some(col => col.name === 'prompt_number');

    if (!obsHasPromptNumber) {
      this.db.run('ALTER TABLE observations ADD COLUMN prompt_number INTEGER');
      console.log('[SessionStore] Added prompt_number column to observations table');
    }

    // Check session_summaries for prompt_number
    const summariesInfo = this.db.query('PRAGMA table_info(session_summaries)').all() as TableColumnInfo[];
    const sumHasPromptNumber = summariesInfo.some(col => col.name === 'prompt_number');

    if (!sumHasPromptNumber) {
      this.db.run('ALTER TABLE session_summaries ADD COLUMN prompt_number INTEGER');
      console.log('[SessionStore] Added prompt_number column to session_summaries table');
    }

    // Record migration
    this.db.prepare('INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)').run(6, new Date().toISOString());
  }
  /**
   * Remove UNIQUE constraint from session_summaries.sdk_session_id (migration 7)
   */
  private removeSessionSummariesUniqueConstraint(): void {
    // Check if migration already applied
    const applied = this.db.prepare('SELECT version FROM schema_versions WHERE version = ?').get(7) as SchemaVersion | undefined;
    if (applied) return;

    // Check if UNIQUE constraint exists
    const summariesIndexes = this.db.query('PRAGMA index_list(session_summaries)').all() as IndexInfo[];
    const hasUniqueConstraint = summariesIndexes.some(idx => idx.unique === 1);

    if (!hasUniqueConstraint) {
      // Already migrated (no constraint exists)
      this.db.prepare('INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)').run(7, new Date().toISOString());
      return;
    }

    console.log('[SessionStore] Removing UNIQUE constraint from session_summaries.sdk_session_id...');

    // Begin transaction
    this.db.run('BEGIN TRANSACTION');

    try {
      // Create new table without UNIQUE constraint
      this.db.run(`
        CREATE TABLE session_summaries_new (
          id INTEGER PRIMARY KEY AUTOINCREMENT,
          sdk_session_id TEXT NOT NULL,
          project TEXT NOT NULL,
          request TEXT,
          investigated TEXT,
          learned TEXT,
          completed TEXT,
          next_steps TEXT,
          files_read TEXT,
          files_edited TEXT,
          notes TEXT,
          prompt_number INTEGER,
          created_at TEXT NOT NULL,
          created_at_epoch INTEGER NOT NULL,
          FOREIGN KEY(sdk_session_id) REFERENCES sdk_sessions(sdk_session_id) ON DELETE CASCADE
        )
      `);

      // Copy data from old table
      this.db.run(`
        INSERT INTO session_summaries_new
        SELECT id, sdk_session_id, project, request, investigated, learned,
               completed, next_steps, files_read, files_edited, notes,
               prompt_number, created_at, created_at_epoch
        FROM session_summaries
      `);

      // Drop old table
      this.db.run('DROP TABLE session_summaries');

      // Rename new table
      this.db.run('ALTER TABLE session_summaries_new RENAME TO session_summaries');

      // Recreate indexes
      this.db.run(`
        CREATE INDEX idx_session_summaries_sdk_session ON session_summaries(sdk_session_id);
        CREATE INDEX idx_session_summaries_project ON session_summaries(project);
        CREATE INDEX idx_session_summaries_created ON session_summaries(created_at_epoch DESC);
      `);

      // Commit transaction
      this.db.run('COMMIT');

      // Record migration
      this.db.prepare('INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)').run(7, new Date().toISOString());

      console.log('[SessionStore] Successfully removed UNIQUE constraint from session_summaries.sdk_session_id');
    } catch (error: any) {
      console.error('[SessionStore] Migration error (remove UNIQUE constraint):', error.message);
      // Rollback on error
      this.db.run('ROLLBACK');
      throw error;
    }
  }
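SQLite cannot drop a UNIQUE constraint with ALTER TABLE, so migration 7 rebuilds the table: create a replacement without the constraint, copy the rows, drop the original, rename, and recreate indexes, all inside one transaction with rollback on failure. A condensed sketch of that rebuild, in Python's `sqlite3` with a trimmed-down illustrative schema:

```python
import sqlite3

# isolation_level=None gives autocommit mode, so the explicit BEGIN/COMMIT below
# controls the transaction (mirroring the BEGIN TRANSACTION / COMMIT in the migration)
conn = sqlite3.connect(":memory:", isolation_level=None)
conn.execute(
    "CREATE TABLE session_summaries("
    "id INTEGER PRIMARY KEY, sdk_session_id TEXT UNIQUE, request TEXT)"
)
conn.execute("INSERT INTO session_summaries(sdk_session_id, request) VALUES ('s1', 'first')")

conn.execute("BEGIN")
try:
    # New table: same columns, no UNIQUE on sdk_session_id
    conn.execute(
        "CREATE TABLE session_summaries_new("
        "id INTEGER PRIMARY KEY, sdk_session_id TEXT, request TEXT)"
    )
    conn.execute(
        "INSERT INTO session_summaries_new "
        "SELECT id, sdk_session_id, request FROM session_summaries"
    )
    conn.execute("DROP TABLE session_summaries")
    conn.execute("ALTER TABLE session_summaries_new RENAME TO session_summaries")
    conn.execute("COMMIT")
except Exception:
    conn.execute("ROLLBACK")
    raise

# Multiple summaries per session are now allowed
conn.execute("INSERT INTO session_summaries(sdk_session_id, request) VALUES ('s1', 'second')")
count = conn.execute(
    "SELECT COUNT(*) FROM session_summaries WHERE sdk_session_id = 's1'"
).fetchone()[0]
print(count)  # 2
```

The same create-copy-drop-rename shape appears again in migration 9 below, where it is used to relax a NOT NULL constraint rather than a UNIQUE one.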
@@ -297,41 +285,37 @@ export class SessionStore {
  /**
   * Add hierarchical fields to observations table (migration 8)
   */
  private addObservationHierarchicalFields(): void {
    // Check if migration already applied
    const applied = this.db.prepare('SELECT version FROM schema_versions WHERE version = ?').get(8) as SchemaVersion | undefined;
    if (applied) return;

    // Check if new fields already exist
    const tableInfo = this.db.query('PRAGMA table_info(observations)').all() as TableColumnInfo[];
    const hasTitle = tableInfo.some(col => col.name === 'title');

    if (hasTitle) {
      // Already migrated
      this.db.prepare('INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)').run(8, new Date().toISOString());
      return;
    }

    console.log('[SessionStore] Adding hierarchical fields to observations table...');

    // Add new columns
    this.db.run(`
      ALTER TABLE observations ADD COLUMN title TEXT;
      ALTER TABLE observations ADD COLUMN subtitle TEXT;
      ALTER TABLE observations ADD COLUMN facts TEXT;
      ALTER TABLE observations ADD COLUMN narrative TEXT;
      ALTER TABLE observations ADD COLUMN concepts TEXT;
      ALTER TABLE observations ADD COLUMN files_read TEXT;
      ALTER TABLE observations ADD COLUMN files_modified TEXT;
    `);

    // Record migration
    this.db.prepare('INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)').run(8, new Date().toISOString());

    console.log('[SessionStore] Successfully added hierarchical fields to observations table');
  }
@@ -339,86 +323,82 @@ export class SessionStore {
  /**
   * Make observations.text nullable (migration 9)
   * The text field is deprecated in favor of structured fields (title, subtitle, narrative, etc.)
   */
  private makeObservationsTextNullable(): void {
    // Check if migration already applied
    const applied = this.db.prepare('SELECT version FROM schema_versions WHERE version = ?').get(9) as SchemaVersion | undefined;
    if (applied) return;

    // Check if text column is already nullable
    const tableInfo = this.db.query('PRAGMA table_info(observations)').all() as TableColumnInfo[];
    const textColumn = tableInfo.find(col => col.name === 'text');

    if (!textColumn || textColumn.notnull === 0) {
      // Already migrated or text column doesn't exist
      this.db.prepare('INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)').run(9, new Date().toISOString());
      return;
    }

    console.log('[SessionStore] Making observations.text nullable...');

    // Begin transaction
    this.db.run('BEGIN TRANSACTION');

    try {
      // Create new table with text as nullable
      this.db.run(`
        CREATE TABLE observations_new (
          id INTEGER PRIMARY KEY AUTOINCREMENT,
          sdk_session_id TEXT NOT NULL,
          project TEXT NOT NULL,
          text TEXT,
          type TEXT NOT NULL CHECK(type IN ('decision', 'bugfix', 'feature', 'refactor', 'discovery', 'change')),
          title TEXT,
          subtitle TEXT,
          facts TEXT,
          narrative TEXT,
          concepts TEXT,
          files_read TEXT,
          files_modified TEXT,
          prompt_number INTEGER,
          created_at TEXT NOT NULL,
          created_at_epoch INTEGER NOT NULL,
          FOREIGN KEY(sdk_session_id) REFERENCES sdk_sessions(sdk_session_id) ON DELETE CASCADE
        )
      `);

      // Copy data from old table (all existing columns)
      this.db.run(`
        INSERT INTO observations_new
        SELECT id, sdk_session_id, project, text, type, title, subtitle, facts,
               narrative, concepts, files_read, files_modified, prompt_number,
               created_at, created_at_epoch
        FROM observations
      `);

      // Drop old table
      this.db.run('DROP TABLE observations');

      // Rename new table
      this.db.run('ALTER TABLE observations_new RENAME TO observations');

      // Recreate indexes
      this.db.run(`
        CREATE INDEX idx_observations_sdk_session ON observations(sdk_session_id);
        CREATE INDEX idx_observations_project ON observations(project);
        CREATE INDEX idx_observations_type ON observations(type);
        CREATE INDEX idx_observations_created ON observations(created_at_epoch DESC);
      `);

      // Commit transaction
      this.db.run('COMMIT');

      // Record migration
      this.db.prepare('INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)').run(9, new Date().toISOString());

      console.log('[SessionStore] Successfully made observations.text nullable');
    } catch (error: any) {
      console.error('[SessionStore] Migration error (make text nullable):', error.message);
      // Rollback on error
      this.db.run('ROLLBACK');
      throw error;
    }
  }
@@ -426,86 +406,82 @@ export class SessionStore {
  /**
   * Create user_prompts table with FTS5 support (migration 10)
   */
  private createUserPromptsTable(): void {
    // Check if migration already applied
    const applied = this.db.prepare('SELECT version FROM schema_versions WHERE version = ?').get(10) as SchemaVersion | undefined;
    if (applied) return;

    // Check if table already exists
    const tableInfo = this.db.query('PRAGMA table_info(user_prompts)').all() as TableColumnInfo[];
    if (tableInfo.length > 0) {
      // Already migrated
      this.db.prepare('INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)').run(10, new Date().toISOString());
      return;
    }

    console.log('[SessionStore] Creating user_prompts table with FTS5 support...');

    // Begin transaction
    this.db.run('BEGIN TRANSACTION');

    try {
      // Create main table (using claude_session_id since sdk_session_id is set asynchronously by worker)
      this.db.run(`
        CREATE TABLE user_prompts (
          id INTEGER PRIMARY KEY AUTOINCREMENT,
          claude_session_id TEXT NOT NULL,
          prompt_number INTEGER NOT NULL,
          prompt_text TEXT NOT NULL,
          created_at TEXT NOT NULL,
          created_at_epoch INTEGER NOT NULL,
          FOREIGN KEY(claude_session_id) REFERENCES sdk_sessions(claude_session_id) ON DELETE CASCADE
        );

        CREATE INDEX idx_user_prompts_claude_session ON user_prompts(claude_session_id);
        CREATE INDEX idx_user_prompts_created ON user_prompts(created_at_epoch DESC);
        CREATE INDEX idx_user_prompts_prompt_number ON user_prompts(prompt_number);
        CREATE INDEX idx_user_prompts_lookup ON user_prompts(claude_session_id, prompt_number);
      `);

      // Create FTS5 virtual table
      this.db.run(`
        CREATE VIRTUAL TABLE user_prompts_fts USING fts5(
          prompt_text,
          content='user_prompts',
          content_rowid='id'
        );
      `);

      // Create triggers to sync FTS5
      this.db.run(`
        CREATE TRIGGER user_prompts_ai AFTER INSERT ON user_prompts BEGIN
          INSERT INTO user_prompts_fts(rowid, prompt_text)
          VALUES (new.id, new.prompt_text);
        END;

        CREATE TRIGGER user_prompts_ad AFTER DELETE ON user_prompts BEGIN
          INSERT INTO user_prompts_fts(user_prompts_fts, rowid, prompt_text)
          VALUES('delete', old.id, old.prompt_text);
        END;

        CREATE TRIGGER user_prompts_au AFTER UPDATE ON user_prompts BEGIN
          INSERT INTO user_prompts_fts(user_prompts_fts, rowid, prompt_text)
          VALUES('delete', old.id, old.prompt_text);
          INSERT INTO user_prompts_fts(rowid, prompt_text)
          VALUES (new.id, new.prompt_text);
        END;
      `);

      // Commit transaction
      this.db.run('COMMIT');

      // Record migration
      this.db.prepare('INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)').run(10, new Date().toISOString());

      console.log('[SessionStore] Successfully created user_prompts table with FTS5 support');
    } catch (error: any) {
      console.error('[SessionStore] Migration error (create user_prompts table):', error.message);
      // Rollback on error
      this.db.run('ROLLBACK');
      throw error;
    }
  }
@@ -990,25 +966,17 @@ export class SessionStore {
    for (const row of rows) {
      // Parse files_read
      if (row.files_read) {
        const files = JSON.parse(row.files_read);
        if (Array.isArray(files)) {
          files.forEach(f => filesReadSet.add(f));
        }
      }

      // Parse files_modified
      if (row.files_modified) {
        const files = JSON.parse(row.files_modified);
        if (Array.isArray(files)) {
          files.forEach(f => filesModifiedSet.add(f));
        }
      }
@@ -495,6 +495,7 @@ export const migration007: Migration = {
  }
};


/**
 * All migrations in order
 */
@@ -847,31 +847,21 @@ export class ChromaSync {
      return;
    }

    // Close client first
    if (this.client) {
      await this.client.close();
    }

    // Explicitly close transport to kill subprocess
    if (this.transport) {
      await this.transport.close();
    }

    logger.info('CHROMA_SYNC', 'Chroma client and subprocess closed', { project: this.project });

    // Always reset state
    this.connected = false;
    this.client = null;
    this.transport = null;
  }

+91 −118
@@ -2,7 +2,7 @@
 * Worker Service - Slim Orchestrator
 *
 * Refactored from 2000-line monolith to ~150-line orchestrator.
 * Routes organized by feature area in http/routes/*.ts
 * See src/services/worker/README.md for architecture details.
 */
@@ -19,7 +19,7 @@ import { promisify } from 'util';

const execAsync = promisify(exec);

// Import composed service layer
import { DatabaseManager } from './worker/DatabaseManager.js';
import { SessionManager } from './worker/SessionManager.js';
import { SSEBroadcaster } from './worker/SSEBroadcaster.js';
@@ -49,7 +49,7 @@ export class WorkerService {
  private mcpReady: boolean = false;
  private initializationCompleteFlag: boolean = false;

  // Service layer
  private dbManager: DatabaseManager;
  private sessionManager: SessionManager;
  private sseBroadcaster: SSEBroadcaster;
@@ -77,7 +77,7 @@ export class WorkerService {
      this.resolveInitialization = resolve;
    });

    // Initialize service layer
    this.dbManager = new DatabaseManager();
    this.sessionManager = new SessionManager(this.dbManager);
    this.sseBroadcaster = new SSEBroadcaster();
@@ -160,19 +160,9 @@ export class WorkerService {
    const marketplaceRoot = path.join(homedir(), '.claude', 'plugins', 'marketplaces', 'thedotmack');
    const packageJsonPath = path.join(marketplaceRoot, 'package.json');

    // Read version from marketplace package.json
    const packageJson = JSON.parse(readFileSync(packageJsonPath, 'utf-8'));
    res.status(200).json({ version: packageJson.version });
  });

  // Instructions endpoint - loads SKILL.md sections on-demand for progressive instruction loading
@@ -326,83 +316,74 @@ export class WorkerService {
   * Prevents process accumulation and memory leaks
   */
  private async cleanupOrphanedProcesses(): Promise<void> {
    const isWindows = process.platform === 'win32';
    const pids: number[] = [];

    if (isWindows) {
      // Windows: Use PowerShell Get-CimInstance to find chroma-mcp processes
      const cmd = `powershell -Command "Get-CimInstance Win32_Process | Where-Object { $_.Name -like '*python*' -and $_.CommandLine -like '*chroma-mcp*' } | Select-Object -ExpandProperty ProcessId"`;
      const { stdout } = await execAsync(cmd, { timeout: 5000 });

      if (!stdout.trim()) {
        logger.debug('SYSTEM', 'No orphaned chroma-mcp processes found (Windows)');
        return;
      }

      const pidStrings = stdout.trim().split('\n');
      for (const pidStr of pidStrings) {
        const pid = parseInt(pidStr.trim(), 10);
        // SECURITY: Validate PID is positive integer before adding to list
        if (!isNaN(pid) && Number.isInteger(pid) && pid > 0) {
          pids.push(pid);
        }
      }
    } else {
      // Unix: Use ps aux | grep
      const { stdout } = await execAsync('ps aux | grep "chroma-mcp" | grep -v grep || true');

      if (!stdout.trim()) {
        logger.debug('SYSTEM', 'No orphaned chroma-mcp processes found (Unix)');
        return;
      }

      const lines = stdout.trim().split('\n');
      for (const line of lines) {
        const parts = line.trim().split(/\s+/);
        if (parts.length > 1) {
          const pid = parseInt(parts[1], 10);
          // SECURITY: Validate PID is positive integer before adding to list
          if (!isNaN(pid) && Number.isInteger(pid) && pid > 0) {
            pids.push(pid);
          }
        }
      }
    }

    if (pids.length === 0) {
      return;
    }

    logger.info('SYSTEM', 'Cleaning up orphaned chroma-mcp processes', {
      platform: isWindows ? 'Windows' : 'Unix',
      count: pids.length,
      pids
    });

    // Kill all found processes
    if (isWindows) {
      for (const pid of pids) {
        // SECURITY: Double-check PID validation before using in taskkill command
        if (!Number.isInteger(pid) || pid <= 0) {
          logger.warn('SYSTEM', 'Skipping invalid PID', { pid });
          continue;
        }
        execSync(`taskkill /PID ${pid} /T /F`, { timeout: 5000, stdio: 'ignore' });
      }
    } else {
      await execAsync(`kill ${pids.join(' ')}`);
    }

    logger.info('SYSTEM', 'Orphaned processes cleaned up', { count: pids.length });
  }
  /**

@@ -433,6 +414,16 @@ export class WorkerService {
    // Clean up any orphaned chroma-mcp processes BEFORE starting our own
    await this.cleanupOrphanedProcesses();

    // Load mode configuration (must happen before database to set observation types)
    const { ModeManager } = await import('./domain/ModeManager.js');
    const { SettingsDefaultsManager } = await import('../shared/SettingsDefaultsManager.js');
    const { USER_SETTINGS_PATH } = await import('../shared/paths.js');

    const settings = SettingsDefaultsManager.loadFromFile(USER_SETTINGS_PATH);
    const modeId = settings.CLAUDE_MEM_MODE;
    ModeManager.getInstance().loadMode(modeId);
    logger.info('SYSTEM', `Mode loaded: ${modeId}`);

    // Initialize database (once, stays open)
    await this.dbManager.initialize();
@@ -538,12 +529,8 @@ export class WorkerService {

    // STEP 4: Close MCP client connection (signals child to exit gracefully)
    if (this.mcpClient) {
      await this.mcpClient.close();
      logger.info('SYSTEM', 'MCP client closed');
    }

    // STEP 5: Close database connection (includes ChromaSync cleanup)
@@ -576,18 +563,13 @@ export class WorkerService {
      return [];
    }

    const cmd = `powershell -Command "Get-CimInstance Win32_Process | Where-Object { $_.ParentProcessId -eq ${parentPid} } | Select-Object -ExpandProperty ProcessId"`;
    const { stdout } = await execAsync(cmd, { timeout: 5000 });
    return stdout
      .trim()
      .split('\n')
      .map(s => parseInt(s.trim(), 10))
      .filter(n => !isNaN(n) && Number.isInteger(n) && n > 0); // SECURITY: Validate each PID
  }
  /**

@@ -600,17 +582,12 @@ export class WorkerService {
      return;
    }

    if (process.platform === 'win32') {
      // /T kills entire process tree, /F forces termination
      await execAsync(`taskkill /PID ${pid} /T /F`, { timeout: 5000 });
      logger.info('SYSTEM', 'Killed process', { pid });
    } else {
      process.kill(pid, 'SIGKILL');
    }
  }
@@ -622,12 +599,8 @@ export class WorkerService {

    while (Date.now() - start < timeoutMs) {
      const stillAlive = pids.filter(pid => {
        process.kill(pid, 0); // Signal 0 checks if process exists - throws if dead
        return true;
      });

      if (stillAlive.length === 0) {
@@ -30,10 +30,8 @@ export class DatabaseManager {
    // Initialize ChromaSync
    this.chromaSync = new ChromaSync('claude-mem');

    // Start background backfill (fire-and-forget)
    this.chromaSync.ensureBackfilled();

    logger.info('DB', 'Database initialized');
  }

@@ -44,14 +42,10 @@ export class DatabaseManager {
  async close(): Promise<void> {
    // Close ChromaSync first (terminates uvx/python processes)
    if (this.chromaSync) {
      await this.chromaSync.close();
      this.chromaSync = null;
    }

    if (this.sessionStore) {
      this.sessionStore.close();
      this.sessionStore = null;
@@ -4,7 +4,7 @@
 */

import { ObservationSearchResult, SessionSummarySearchResult, UserPromptSearchResult } from '../sqlite/types.js';
import { ModeManager } from '../domain/ModeManager.js';

// Token estimation constant (matches context-generator)
const CHARS_PER_TOKEN_ESTIMATE = 4;

@@ -55,10 +55,10 @@ Tips:
  formatObservationIndex(obs: ObservationSearchResult, _index: number): string {
    const id = `#${obs.id}`;
    const time = this.formatTime(obs.created_at_epoch);
    const icon = ModeManager.getInstance().getTypeIcon(obs.type);
    const title = obs.title || 'Untitled';
    const readTokens = this.estimateReadTokens(obs);
    const workEmoji = ModeManager.getInstance().getWorkEmoji(obs.type);
    const workTokens = obs.discovery_tokens || 0;
    const workDisplay = workTokens > 0 ? `${workEmoji} ${workTokens}` : '-';
@@ -116,7 +116,7 @@ Tips:
  formatObservationSearchRow(obs: ObservationSearchResult, lastTime: string): { row: string; time: string } {
    const id = `#${obs.id}`;
    const time = this.formatTime(obs.created_at_epoch);
    const icon = ModeManager.getInstance().getTypeIcon(obs.type);
    const title = obs.title || 'Untitled';
    const readTokens = this.estimateReadTokens(obs);
@@ -10,7 +10,7 @@ The Worker Service is an Express HTTP server that handles all claude-mem operati
Hook (plugin/scripts/*-hook.js)
  → HTTP Request to Worker (localhost:37777)
  → Route Handler (http/routes/*.ts)
  → MCP Server Tool (for search) OR Service Layer (for session/data)
  → Database (SQLite3 + Chroma vector DB)
```
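The request flow above can be modeled as a small, self-contained sketch. The class and method names below (`FakeSessionService`, `handleInit`) are illustrative stand-ins for the real claude-mem services, not the actual API:

```typescript
// Sketch of the Hook -> Route Handler -> Service Layer flow described above.
// All names here are hypothetical stand-ins, not the claude-mem API.
interface SessionRecord {
  id: number;
  project: string;
}

// Service layer: owns the data, knows nothing about HTTP.
class FakeSessionService {
  private sessions = new Map<number, SessionRecord>();
  private nextId = 1;

  init(project: string): SessionRecord {
    const record = { id: this.nextId++, project };
    this.sessions.set(record.id, record);
    return record;
  }
}

// Route handler layer: translates HTTP-ish input into service calls.
class SessionRoutesSketch {
  constructor(private sessions: FakeSessionService) {}

  handleInit(body: { project: string }): { status: number; json: SessionRecord } {
    const record = this.sessions.init(body.project);
    return { status: 200, json: record };
  }
}

const routes = new SessionRoutesSketch(new FakeSessionService());
const res = routes.handleInit({ project: 'claude-mem' });
console.log(res.status, res.json.id); // → 200 1
```

The point of the layering is that the service class stays HTTP-free, so it can later sit behind an MCP tool instead of a route handler without changes.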
@@ -22,13 +22,13 @@ src/services/worker/
├── WorkerService.ts # Slim orchestrator (~150 lines)
├── http/ # HTTP layer
│   ├── middleware.ts # Shared middleware (logging, CORS, etc.)
│   └── routes/ # Route handlers organized by feature area
│       ├── SessionRoutes.ts # Session lifecycle (init, observations, summarize, complete)
│       ├── DataRoutes.ts # Data retrieval (get observations, summaries, prompts, stats)
│       ├── SearchRoutes.ts # Search/MCP proxy (all search endpoints)
│       ├── SettingsRoutes.ts # Settings, MCP toggle, branch switching
│       └── ViewerRoutes.ts # Health check, viewer UI, SSE stream
└── services/ # Business logic services (existing, NO CHANGES in Phase 1)
    ├── DatabaseManager.ts # SQLite connection management
    ├── SessionManager.ts # Session state tracking
    ├── SDKAgent.ts # Claude Agent SDK for observations/summaries
@@ -46,7 +46,7 @@ src/services/worker/
- `GET /stream` - SSE stream for real-time updates

### SessionRoutes.ts
Session lifecycle operations (use service layer directly):
- `POST /sessions/init` - Initialize new session
- `POST /sessions/:sessionId/observations` - Add tool usage observations
- `POST /sessions/:sessionId/summarize` - Trigger session summary
@@ -58,7 +58,7 @@ Session lifecycle operations (use service layer directly):
- `POST /sessions/claude-id/:claudeId/complete` - Complete by claude_id

### DataRoutes.ts
Data retrieval operations (use service layer directly):
- `GET /observations` - List observations (paginated)
- `GET /summaries` - List session summaries (paginated)
- `GET /prompts` - List user prompts (paginated)
@@ -91,7 +91,7 @@ All search operations (proxy to MCP server):
- `GET /search/help` - Search help

### SettingsRoutes.ts
Settings and configuration (use service layer directly):
- `GET /settings` - Get user settings
- `POST /settings` - Update user settings
- `GET /mcp/status` - Get MCP server status
@@ -109,14 +109,14 @@ Settings and configuration (use service layer directly):

**MCP vs Direct DB Split** (inherited, not changed in Phase 1):
- Search operations → MCP server (mem-search)
- Session/data operations → Direct DB access via service layer

## Future Phase 2

Phase 2 will unify the architecture:
1. Expand MCP server to handle ALL operations (not just search)
2. Convert all route handlers to proxy through MCP
3. Move database logic from service layer into MCP tools
4. Result: Worker becomes pure HTTP → MCP proxy for maximum portability

This separation allows the worker to be deployed anywhere (as a CLI tool, cloud service, etc.) without carrying database dependencies.
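The MCP-vs-direct-DB split can be reduced to a routing decision. This is an illustrative sketch only; `pickBackend` is a hypothetical helper, and the real routing lives in the route classes themselves:

```typescript
// Illustrative sketch of the split described above: search endpoints proxy
// to the MCP server, session/data endpoints hit the service layer directly.
// `pickBackend` is a hypothetical name, not part of claude-mem.
type Backend = 'mcp' | 'service-layer';

function pickBackend(path: string): Backend {
  return path.startsWith('/search') ? 'mcp' : 'service-layer';
}

console.log(pickBackend('/search/help'));   // → mcp
console.log(pickBackend('/observations'));  // → service-layer
```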
@@ -126,7 +126,7 @@ This separation allows the worker to be deployed anywhere (as a CLI tool, cloud
1. Choose the appropriate route file based on the endpoint's purpose
2. Add the route handler method to the class
3. Register the route in the `setupRoutes()` method
4. Import any needed services in the constructor
5. Follow the existing patterns for error handling and logging

Example:
@@ -149,7 +149,7 @@ app.get('/foo', this.handleGetFoo.bind(this));
## Key Design Principles

1. **Progressive Disclosure**: Navigate from high-level (WorkerService.ts) to specific routes to implementation details
2. **Single Responsibility**: Each route class handles one feature area
3. **Dependency Injection**: Route classes receive only the services they need
4. **Consistent Error Handling**: All handlers use try/catch with logger.failure()
5. **Bound Methods**: All route handlers use `.bind(this)` to preserve context
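Principles 3 and 5 can be shown in a minimal sketch. The route table here is a plain `Map` standing in for an Express app, so the example stays self-contained:

```typescript
// Why `.bind(this)` matters: a method reference detached from its instance
// loses `this`. Binding at registration time preserves the route class context.
type Handler = () => string;

class FooRoutesSketch {
  // Dependency injection: the handler receives only what it needs.
  constructor(private greeting: string) {}

  handleGetFoo(): string {
    return this.greeting; // relies on `this` being the FooRoutesSketch instance
  }
}

// Minimal stand-in for an Express app's route table.
const table = new Map<string, Handler>();
const register = (path: string, handler: Handler) => table.set(path, handler);

const fooRoutes = new FooRoutesSketch('hello');
register('/foo', fooRoutes.handleGetFoo.bind(fooRoutes)); // bound: `this` preserved

console.log(table.get('/foo')!()); // → hello
```

Without the `.bind(...)` call, invoking the stored handler would throw (or read `undefined`) because `this` would no longer point at the route instance.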
@@ -19,6 +19,7 @@ import { buildInitPrompt, buildObservationPrompt, buildSummaryPrompt, buildConti
import { SettingsDefaultsManager } from '../../shared/SettingsDefaultsManager.js';
import { USER_SETTINGS_PATH } from '../../shared/paths.js';
import type { ActiveSession, SDKUserMessage, PendingMessage } from '../worker-types.js';
import { ModeManager } from '../domain/ModeManager.js';

// Import Agent SDK (assumes it's installed)
// @ts-ignore - Agent SDK types may not be available

@@ -185,6 +186,9 @@ export class SDKAgent {
   * - We just use the session_id we're given - simple and reliable
   */
  private async *createMessageGenerator(session: ActiveSession): AsyncIterableIterator<SDKUserMessage> {
    // Load active mode
    const mode = ModeManager.getInstance().getActiveMode();

    // Yield initial user prompt with context (or continuation if prompt #2+)
    // CRITICAL: Both paths use session.claudeSessionId from the hook
    yield {
@@ -192,8 +196,8 @@ export class SDKAgent {
      message: {
        role: 'user',
        content: session.lastPromptNumber === 1
          ? buildInitPrompt(session.project, session.claudeSessionId, session.userPrompt, mode)
          : buildContinuationPrompt(session.userPrompt, session.lastPromptNumber, session.claudeSessionId, mode)
      },
      session_id: session.claudeSessionId,
      parent_tool_use_id: null,
@@ -237,7 +241,7 @@ export class SDKAgent {
            user_prompt: session.userPrompt,
            last_user_message: message.last_user_message || '',
            last_assistant_message: message.last_assistant_message || ''
          })
        }, mode)
      },
      session_id: session.claudeSessionId,
      parent_tool_use_id: null,
@@ -276,7 +280,7 @@ export class SDKAgent {
        concepts: obs.concepts?.length ?? 0
      });

      // Sync to Chroma
      const chromaStart = Date.now();
      const obsType = obs.type;
      const obsTitle = obs.title || '(untitled)';

@@ -296,13 +300,6 @@ export class SDKAgent {
        type: obsType,
        title: obsTitle
      });

      // Broadcast to SSE clients (for web UI)
@@ -352,7 +349,7 @@ export class SDKAgent {
        hasNextSteps: !!summary.next_steps
      });

      // Sync to Chroma
      const chromaStart = Date.now();
      const summaryRequest = summary.request || '(no request)';
      this.dbManager.getChromaSync().syncSummary(

@@ -370,12 +367,6 @@ export class SDKAgent {
        duration: `${chromaDuration}ms`,
        request: summaryRequest
      });

      // Broadcast to SSE clients (for web UI)
@@ -53,15 +53,9 @@ export class SSEBroadcaster {

    logger.debug('WORKER', 'SSE broadcast sent', { eventType: event.type, clients: this.sseClients.size });

    // Single-pass write
    for (const client of this.sseClients) {
      client.write(data);
    }
  }

@@ -77,10 +71,6 @@ export class SSEBroadcaster {
   */
  private sendToClient(res: Response, event: SSEEvent): void {
    const data = `data: ${JSON.stringify(event)}\n\n`;
    res.write(data);
  }
}
@@ -15,6 +15,7 @@ import { TimelineService, TimelineItem } from './TimelineService.js';
import { ObservationSearchResult, SessionSummarySearchResult, UserPromptSearchResult } from '../sqlite/types.js';
import { logger } from '../../utils/logger.js';
import { formatDate, formatTime, formatDateTime, extractFirstFile, groupByDate, estimateTokens } from '../../shared/timeline-formatting.js';
import { ModeManager } from '../domain/ModeManager.js';

const COLLECTION_NAME = 'cm__claude-mem';
const RECENCY_WINDOW_DAYS = 90;
@@ -590,15 +591,7 @@ export class SearchManager {
        lastTime = '';
      }

      const icon = ModeManager.getInstance().getTypeIcon(obs.type);

      const time = formatTime(item.epoch);
      const title = obs.title || 'Untitled';
@@ -1675,15 +1668,7 @@ export class SearchManager {
      }

      const icon = ModeManager.getInstance().getTypeIcon(obs.type);

      const time = formatTime(item.epoch);
      const title = obs.title || 'Untitled';
@@ -1927,15 +1912,7 @@ export class SearchManager {
      }

      const icon = ModeManager.getInstance().getTypeIcon(obs.type);

      const time = formatTime(item.epoch);
      const title = obs.title || 'Untitled';
@@ -4,6 +4,7 @@
 */

import { ObservationSearchResult, SessionSummarySearchResult, UserPromptSearchResult } from '../sqlite/types.js';
import { ModeManager } from '../domain/ModeManager.js';

/**
 * Timeline item for unified chronological display

@@ -210,15 +211,7 @@ export class TimelineService {
   * Get icon for observation type
   */
  private getTypeIcon(type: string): string {
    return ModeManager.getInstance().getTypeIcon(type);
  }

  /**
@@ -2,7 +2,7 @@
 * Data Routes
 *
 * Handles data retrieval operations: observations, summaries, prompts, stats, processing status.
 * All endpoints use direct database access via service layer.
 */

import express, { Request, Response } from 'express';
@@ -51,9 +51,6 @@ export class SessionRoutes extends BaseRouteHandler {
    });

    session.generatorPromise = this.sdkAgent.startSession(session, this.workerService)
      .finally(() => {
        logger.info('SESSION', `Generator finished`, { sessionId: sessionDbId });
        session.generatorPromise = null;

@@ -102,7 +99,7 @@ export class SessionRoutes extends BaseRouteHandler {
        created_at_epoch: latestPrompt.created_at_epoch
      });

      // Sync user prompt to Chroma
      const chromaStart = Date.now();
      const promptText = latestPrompt.prompt_text;
      this.dbManager.getChromaSync().syncUserPrompt(

@@ -122,11 +119,6 @@ export class SessionRoutes extends BaseRouteHandler {
        duration: `${chromaDuration}ms`,
        prompt: truncatedPrompt
      });
    }
@@ -138,9 +130,6 @@ export class SessionRoutes extends BaseRouteHandler {
|
||||
});
|
||||
|
||||
session.generatorPromise = this.sdkAgent.startSession(session, this.workerService)
|
||||
.catch(err => {
|
||||
logger.failure('SDK', 'SDK agent error', { sessionId: sessionDbId }, err);
|
||||
})
|
||||
.finally(() => {
|
||||
// Clear generator reference when completed
|
||||
logger.info('SESSION', `Generator finished`, { sessionId: sessionDbId });
|
||||
@@ -309,26 +298,13 @@ export class SessionRoutes extends BaseRouteHandler {
|
||||
}
|
||||
|
||||
// Strip memory tags from tool_input and tool_response
|
||||
let cleanedToolInput = '{}';
|
||||
let cleanedToolResponse = '{}';
|
||||
const cleanedToolInput = tool_input !== undefined
|
||||
? stripMemoryTagsFromJson(JSON.stringify(tool_input))
|
||||
: '{}';
|
||||
|
||||
try {
|
||||
cleanedToolInput = tool_input !== undefined
|
||||
? stripMemoryTagsFromJson(JSON.stringify(tool_input))
|
||||
: '{}';
|
||||
} catch (error) {
|
||||
logger.debug('SESSION', 'Failed to serialize tool_input', { sessionDbId }, error);
|
||||
cleanedToolInput = '{"error": "Failed to serialize tool_input"}';
|
||||
}
|
||||
|
||||
try {
|
||||
cleanedToolResponse = tool_response !== undefined
|
||||
? stripMemoryTagsFromJson(JSON.stringify(tool_response))
|
||||
: '{}';
|
||||
} catch (error) {
|
||||
logger.debug('SESSION', 'Failed to serialize tool_result', { sessionDbId }, error);
|
||||
cleanedToolResponse = '{"error": "Failed to serialize tool_response"}';
|
||||
}
|
||||
const cleanedToolResponse = tool_response !== undefined
|
||||
? stripMemoryTagsFromJson(JSON.stringify(tool_response))
|
||||
: '{}';
|
||||
|
||||
// Queue observation
|
||||
this.sessionManager.queueObservation(sessionDbId, {
|
||||
|
||||
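The last hunk above drops the try/catch around `JSON.stringify`, so a non-serializable `tool_input` (for example a circular structure) now propagates instead of being replaced with an error placeholder. A small sketch of that contract, with `stripMemoryTagsFromJson` stubbed as identity since its implementation is not shown in this diff:

```typescript
// Stand-in for the real helper, which strips memory tags from JSON text
const stripMemoryTagsFromJson = (json: string): string => json;

// Mirrors the refactored logic: undefined becomes '{}', everything else is
// stringified and stripped. JSON.stringify may throw (circular refs, BigInt).
function serializeToolPayload(payload: unknown): string {
  return payload !== undefined
    ? stripMemoryTagsFromJson(JSON.stringify(payload))
    : '{}';
}
```

`serializeToolPayload(undefined)` yields `'{}'`, while a circular object now throws a `TypeError` that the caller must handle.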
@@ -13,12 +13,7 @@ import { getPackageRoot } from '../../../../shared/paths.js';
 import { logger } from '../../../../utils/logger.js';
 import { SettingsManager } from '../../SettingsManager.js';
 import { getBranchInfo, switchBranch, pullUpdates } from '../../BranchManager.js';
-import {
-  OBSERVATION_TYPES,
-  OBSERVATION_CONCEPTS,
-  ObservationType,
-  ObservationConcept
-} from '../../../../constants/observation-metadata.js';
+import { ModeManager } from '../../domain/ModeManager.js';
 import { BaseRouteHandler } from '../BaseRouteHandler.js';
 import { SettingsDefaultsManager } from '../../../../shared/SettingsDefaultsManager.js';
 import { clearPortCache } from '../../../../shared/worker-utils.js';

@@ -296,25 +291,11 @@ export class SettingsRoutes extends BaseRouteHandler {
       }
     }
 
-    // Validate observation types
-    if (settings.CLAUDE_MEM_CONTEXT_OBSERVATION_TYPES) {
-      const types = settings.CLAUDE_MEM_CONTEXT_OBSERVATION_TYPES.split(',').map((t: string) => t.trim());
-      for (const type of types) {
-        if (type && !OBSERVATION_TYPES.includes(type as ObservationType)) {
-          return { valid: false, error: `Invalid observation type: ${type}. Valid types: ${OBSERVATION_TYPES.join(', ')}` };
-        }
-      }
-    }
+    // Skip observation types validation - any type string is valid since modes define their own types
+    // The database accepts any TEXT value, and mode-specific validation happens at parse time
 
-    // Validate observation concepts
-    if (settings.CLAUDE_MEM_CONTEXT_OBSERVATION_CONCEPTS) {
-      const concepts = settings.CLAUDE_MEM_CONTEXT_OBSERVATION_CONCEPTS.split(',').map((c: string) => c.trim());
-      for (const concept of concepts) {
-        if (concept && !OBSERVATION_CONCEPTS.includes(concept as ObservationConcept)) {
-          return { valid: false, error: `Invalid observation concept: ${concept}. Valid concepts: ${OBSERVATION_CONCEPTS.join(', ')}` };
-        }
-      }
-    }
+    // Skip observation concepts validation - any concept string is valid since modes define their own concepts
+    // The database accepts any TEXT value, and mode-specific validation happens at parse time
 
     return { valid: true };
   }
@@ -332,25 +313,20 @@ export class SettingsRoutes extends BaseRouteHandler {
    * Toggle MCP search server (rename .mcp.json <-> .mcp.json.disabled)
    */
   private toggleMcp(enabled: boolean): void {
-    try {
-      const packageRoot = getPackageRoot();
-      const mcpPath = path.join(packageRoot, 'plugin', '.mcp.json');
-      const mcpDisabledPath = path.join(packageRoot, 'plugin', '.mcp.json.disabled');
-
-      if (enabled && existsSync(mcpDisabledPath)) {
-        // Enable: rename .mcp.json.disabled -> .mcp.json
-        renameSync(mcpDisabledPath, mcpPath);
-        logger.info('WORKER', 'MCP search server enabled');
-      } else if (!enabled && existsSync(mcpPath)) {
-        // Disable: rename .mcp.json -> .mcp.json.disabled
-        renameSync(mcpPath, mcpDisabledPath);
-        logger.info('WORKER', 'MCP search server disabled');
-      } else {
-        logger.debug('WORKER', 'MCP toggle no-op (already in desired state)', { enabled });
-      }
-    } catch (error) {
-      logger.failure('WORKER', 'Failed to toggle MCP', { enabled }, error as Error);
-      throw error;
+    const packageRoot = getPackageRoot();
+    const mcpPath = path.join(packageRoot, 'plugin', '.mcp.json');
+    const mcpDisabledPath = path.join(packageRoot, 'plugin', '.mcp.json.disabled');
+
+    if (enabled && existsSync(mcpDisabledPath)) {
+      // Enable: rename .mcp.json.disabled -> .mcp.json
+      renameSync(mcpDisabledPath, mcpPath);
+      logger.info('WORKER', 'MCP search server enabled');
+    } else if (!enabled && existsSync(mcpPath)) {
+      // Disable: rename .mcp.json -> .mcp.json.disabled
+      renameSync(mcpPath, mcpDisabledPath);
+      logger.info('WORKER', 'MCP search server disabled');
+    } else {
+      logger.debug('WORKER', 'MCP toggle no-op (already in desired state)', { enabled });
     }
   }
@@ -24,6 +24,10 @@ export class ViewerRoutes extends BaseRouteHandler {
   }
 
   setupRoutes(app: express.Application): void {
+    // Serve static UI assets (JS, CSS, fonts, etc.)
+    const packageRoot = getPackageRoot();
+    app.use(express.static(path.join(packageRoot, 'ui')));
+
     app.get('/health', this.handleHealth.bind(this));
     app.get('/', this.handleViewerUI.bind(this));
     app.get('/stream', this.handleSSEStream.bind(this));
@@ -22,6 +22,7 @@ export interface SettingsDefaults {
   CLAUDE_MEM_LOG_LEVEL: string;
   CLAUDE_MEM_PYTHON_VERSION: string;
   CLAUDE_CODE_PATH: string;
+  CLAUDE_MEM_MODE: string;
   // Token Economics
   CLAUDE_MEM_CONTEXT_SHOW_READ_TOKENS: string;
   CLAUDE_MEM_CONTEXT_SHOW_WORK_TOKENS: string;

@@ -54,6 +55,7 @@ export class SettingsDefaultsManager {
   CLAUDE_MEM_LOG_LEVEL: 'INFO',
   CLAUDE_MEM_PYTHON_VERSION: '3.13',
   CLAUDE_CODE_PATH: '', // Empty means auto-detect via 'which claude'
+  CLAUDE_MEM_MODE: 'code', // Default mode profile
   // Token Economics
   CLAUDE_MEM_CONTEXT_SHOW_READ_TOKENS: 'true',
   CLAUDE_MEM_CONTEXT_SHOW_WORK_TOKENS: 'true',
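The defaults hunks register `CLAUDE_MEM_MODE` with a `'code'` fallback. How such a defaults table typically resolves against user settings can be sketched like this — the `resolveSettings` helper is an assumption for illustration, not code from the diff:

```typescript
// Subset of the SettingsDefaults shape from the diff
interface SettingsDefaultsSketch {
  CLAUDE_MEM_MODE: string;
  CLAUDE_MEM_LOG_LEVEL: string;
}

const DEFAULTS: SettingsDefaultsSketch = {
  CLAUDE_MEM_MODE: 'code',      // default mode profile
  CLAUDE_MEM_LOG_LEVEL: 'INFO',
};

// Hypothetical resolver: user-provided values win, unset keys fall back
function resolveSettings(user: Partial<SettingsDefaultsSketch>): SettingsDefaultsSketch {
  return { ...DEFAULTS, ...user };
}
```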
@@ -1,16 +0,0 @@
-import { getWorkerRestartInstructions } from '../utils/error-messages.js';
-
-/**
- * Handles fetch errors by providing user-friendly messages for connection issues
- * @throws Error with helpful message if worker is unreachable, re-throws original otherwise
- */
-export function handleWorkerError(error: any): never {
-  if (error.cause?.code === 'ECONNREFUSED' ||
-      error.code === 'ConnectionRefused' || // Bun-specific error format
-      error.name === 'TimeoutError' ||
-      error.message?.includes('fetch failed') ||
-      error.message?.includes('Unable to connect')) {
-    throw new Error(getWorkerRestartInstructions());
-  }
-  throw error;
-}
@@ -32,6 +32,7 @@ export const ARCHIVES_DIR = join(DATA_DIR, 'archives');
 export const LOGS_DIR = join(DATA_DIR, 'logs');
 export const TRASH_DIR = join(DATA_DIR, 'trash');
 export const BACKUPS_DIR = join(DATA_DIR, 'backups');
+export const MODES_DIR = join(DATA_DIR, 'modes');
 export const USER_SETTINGS_PATH = join(DATA_DIR, 'settings.json');
 export const DB_PATH = join(DATA_DIR, 'claude-mem.db');
 export const VECTOR_DB_DIR = join(DATA_DIR, 'vector-db');

@@ -71,6 +72,14 @@ export function ensureAllDataDirs(): void {
   ensureDir(LOGS_DIR);
   ensureDir(TRASH_DIR);
   ensureDir(BACKUPS_DIR);
+  ensureDir(MODES_DIR);
 }
 
+/**
+ * Ensure modes directory exists
+ */
+export function ensureModesDir(): void {
+  ensureDir(MODES_DIR);
+}
+
 /**

@@ -102,10 +111,10 @@ export function getCurrentProjectName(): string {
  * Find package root directory
  *
  * Works because bundled hooks are in plugin/scripts/,
- * so package root is always two levels up
+ * so package root is always one level up (the plugin directory)
  */
 export function getPackageRoot(): string {
-  return join(_dirname, '..', '..');
+  return join(_dirname, '..');
 }
 
 /**
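The last hunk changes `getPackageRoot` from two levels up to one, matching the new bundle layout where hooks live directly under the plugin directory. With the directory passed in explicitly (the real code closes over a module-level `_dirname`), the resolution looks like:

```typescript
import { join } from 'node:path';

// One level up from the bundled scripts directory is the plugin root
function packageRootFrom(dirname: string): string {
  return join(dirname, '..');
}
```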
@@ -13,101 +13,52 @@ export function extractLastMessage(
   stripSystemReminders: boolean = false
 ): string {
   if (!transcriptPath || !existsSync(transcriptPath)) {
-    logger.happyPathError(
-      'PARSER',
-      'Transcript path missing or file does not exist',
-      undefined,
-      { transcriptPath, role },
-      ''
-    );
-    return '';
+    throw new Error(`Transcript path missing or file does not exist: ${transcriptPath}`);
   }
 
-  try {
-    const content = readFileSync(transcriptPath, 'utf-8').trim();
-    if (!content) {
-      logger.happyPathError(
-        'PARSER',
-        'Transcript file exists but is empty',
-        undefined,
-        { transcriptPath, role },
-        ''
-      );
-      return '';
-    }
+  const content = readFileSync(transcriptPath, 'utf-8').trim();
+  if (!content) {
+    throw new Error(`Transcript file exists but is empty: ${transcriptPath}`);
+  }
 
-    const lines = content.split('\n');
-    let foundMatchingRole = false;
+  const lines = content.split('\n');
+  let foundMatchingRole = false;
 
-    for (let i = lines.length - 1; i >= 0; i--) {
-      try {
-        const line = JSON.parse(lines[i]);
-        if (line.type === role) {
-          foundMatchingRole = true;
+  for (let i = lines.length - 1; i >= 0; i--) {
+    const line = JSON.parse(lines[i]);
+    if (line.type === role) {
+      foundMatchingRole = true;
 
-          if (line.message?.content) {
-            let text = '';
-            const msgContent = line.message.content;
+      if (line.message?.content) {
+        let text = '';
+        const msgContent = line.message.content;
 
-            if (typeof msgContent === 'string') {
-              text = msgContent;
-            } else if (Array.isArray(msgContent)) {
-              text = msgContent
-                .filter((c: any) => c.type === 'text')
-                .map((c: any) => c.text)
-                .join('\n');
-            } else {
-              // Unknown content format - log error and skip this message
-              logger.error(
-                'PARSER',
-                'Unknown message content format',
-                {
-                  role,
-                  transcriptPath,
-                  contentType: typeof msgContent,
-                  content: msgContent
-                },
-                new Error('Message content is neither string nor array')
-              );
-              continue;
-            }
+        if (typeof msgContent === 'string') {
+          text = msgContent;
+        } else if (Array.isArray(msgContent)) {
+          text = msgContent
+            .filter((c: any) => c.type === 'text')
+            .map((c: any) => c.text)
+            .join('\n');
+        } else {
+          // Unknown content format - throw error
+          throw new Error(`Unknown message content format in transcript. Type: ${typeof msgContent}`);
+        }
 
-            if (stripSystemReminders) {
-              text = text.replace(/<system-reminder>[\s\S]*?<\/system-reminder>/g, '');
-              text = text.replace(/\n{3,}/g, '\n\n').trim();
-            }
+        if (stripSystemReminders) {
+          text = text.replace(/<system-reminder>[\s\S]*?<\/system-reminder>/g, '');
+          text = text.replace(/\n{3,}/g, '\n\n').trim();
+        }
 
-            // Log if we found the role but the text is empty after processing
-            if (!text || text.trim() === '') {
-              logger.happyPathError(
-                'PARSER',
-                'Found message but content is empty after processing',
-                undefined,
-                { role, transcriptPath, msgContentType: typeof msgContent, stripSystemReminders },
-                ''
-              );
-            }
-
-            return text;
-          }
-        }
-      } catch {
-        continue;
-      }
-    }
-
-    // If we searched the whole transcript and didn't find any message of this role
-    if (!foundMatchingRole) {
-      logger.happyPathError(
-        'PARSER',
-        'No message found for role in transcript',
-        undefined,
-        { role, transcriptPath, totalLines: lines.length },
-        ''
-      );
-    }
-  } catch (error) {
-    logger.error('HOOK', 'Failed to read transcript', { transcriptPath }, error as Error);
-  }
+        // Return text even if empty - caller decides if that's an error
+        return text;
+      }
+    }
+  }
 
+  // If we searched the whole transcript and didn't find any message of this role
+  if (!foundMatchingRole) {
+    throw new Error(`No message found for role '${role}' in transcript: ${transcriptPath}`);
+  }
+
   return '';
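The hunk above converts `extractLastMessage` from silent empty-string returns to thrown errors. Its core loop — scan the JSONL transcript backwards for the newest message of a role and flatten its content — can be sketched without the file I/O. Names and shapes follow the diff, but this is a simplification (it drops the `stripSystemReminders` pass, for example):

```typescript
// Scan JSONL transcript text backwards for the newest message of `role`.
// Throws instead of returning '' when the role never appears - the
// contract the refactor introduces.
function lastMessageText(transcript: string, role: string): string {
  const lines = transcript.trim().split('\n');
  for (let i = lines.length - 1; i >= 0; i--) {
    const line = JSON.parse(lines[i]);
    if (line.type !== role) continue;
    const msgContent = line.message?.content;
    if (msgContent === undefined) continue; // matching role but no content - keep scanning
    if (typeof msgContent === 'string') return msgContent;
    if (Array.isArray(msgContent)) {
      // Keep only text blocks, dropping tool_use and other block types
      return msgContent
        .filter((c: any) => c.type === 'text')
        .map((c: any) => c.text)
        .join('\n');
    }
    throw new Error(`Unknown message content format. Type: ${typeof msgContent}`);
  }
  throw new Error(`No message found for role '${role}' in transcript`);
}
```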
@@ -63,55 +63,35 @@ export function clearPortCache(): void {
  * Changed from /health to /api/readiness to ensure MCP initialization is complete
  */
 async function isWorkerHealthy(): Promise<boolean> {
-  try {
-    const port = getWorkerPort();
-    const response = await fetch(`http://127.0.0.1:${port}/api/readiness`, {
-      signal: AbortSignal.timeout(HEALTH_CHECK_TIMEOUT_MS)
-    });
-    return response.ok;
-  } catch (error) {
-    logger.debug('SYSTEM', 'Worker readiness check failed', {
-      error: error instanceof Error ? error.message : String(error),
-      errorType: error?.constructor?.name
-    });
-    return false;
-  }
+  const port = getWorkerPort();
+  const response = await fetch(`http://127.0.0.1:${port}/api/readiness`, {
+    signal: AbortSignal.timeout(HEALTH_CHECK_TIMEOUT_MS)
+  });
+  return response.ok;
 }
 
 /**
  * Get the current plugin version from package.json
  */
-function getPluginVersion(): string | null {
-  try {
-    const packageJsonPath = path.join(MARKETPLACE_ROOT, 'package.json');
-    const packageJson = JSON.parse(readFileSync(packageJsonPath, 'utf-8'));
-    return packageJson.version;
-  } catch (error) {
-    logger.debug('SYSTEM', 'Failed to read plugin version', {
-      error: error instanceof Error ? error.message : String(error)
-    });
-    return null;
-  }
+function getPluginVersion(): string {
+  const packageJsonPath = path.join(MARKETPLACE_ROOT, 'package.json');
+  const packageJson = JSON.parse(readFileSync(packageJsonPath, 'utf-8'));
+  return packageJson.version;
 }
 
 /**
  * Get the running worker's version from the API
  */
-async function getWorkerVersion(): Promise<string | null> {
-  try {
-    const port = getWorkerPort();
-    const response = await fetch(`http://127.0.0.1:${port}/api/version`, {
-      signal: AbortSignal.timeout(HEALTH_CHECK_TIMEOUT_MS)
-    });
-    if (!response.ok) return null;
-    const data = await response.json() as { version: string };
-    return data.version;
-  } catch (error) {
-    logger.debug('SYSTEM', 'Failed to get worker version', {
-      error: error instanceof Error ? error.message : String(error)
-    });
-    return null;
-  }
+async function getWorkerVersion(): Promise<string> {
+  const port = getWorkerPort();
+  const response = await fetch(`http://127.0.0.1:${port}/api/version`, {
+    signal: AbortSignal.timeout(HEALTH_CHECK_TIMEOUT_MS)
+  });
+  if (!response.ok) {
+    throw new Error(`Failed to get worker version: ${response.status}`);
+  }
+  const data = await response.json() as { version: string };
+  return data.version;
 }

@@ -122,11 +102,6 @@ async function ensureWorkerVersionMatches(): Promise<void> {
   const pluginVersion = getPluginVersion();
   const workerVersion = await getWorkerVersion();
 
-  if (!pluginVersion || !workerVersion) {
-    // Can't determine versions, skip check
-    return;
-  }
-
   if (pluginVersion !== workerVersion) {
     logger.info('SYSTEM', 'Worker version mismatch detected - restarting worker', {
       pluginVersion,

@@ -144,11 +119,7 @@ async function ensureWorkerVersionMatches(): Promise<void> {
 
     // Verify it's healthy
     if (!await isWorkerHealthy()) {
-      logger.error('SYSTEM', 'Worker failed to restart after version mismatch', {
-        expectedVersion: pluginVersion,
-        runningVersion: workerVersion,
-        port: getWorkerPort()
-      });
       throw new Error(`Worker failed to restart after version mismatch. Expected ${pluginVersion}, was running ${workerVersion}`);
     }
   }
 }

@@ -166,15 +137,10 @@ async function startWorker(): Promise<boolean> {
   mkdirSync(dataDir, { recursive: true });
 
   if (!existsSync(pm2MigratedMarker)) {
-    try {
-      spawnSync('pm2', ['delete', 'claude-mem-worker'], { stdio: 'ignore' });
-      // Mark migration as complete
-      writeFileSync(pm2MigratedMarker, new Date().toISOString(), 'utf-8');
-      logger.debug('SYSTEM', 'PM2 cleanup completed and marked');
-    } catch {
-      // PM2 not installed or process doesn't exist - still mark as migrated
-      writeFileSync(pm2MigratedMarker, new Date().toISOString(), 'utf-8');
-    }
+    spawnSync('pm2', ['delete', 'claude-mem-worker'], { stdio: 'ignore' });
+    // Mark migration as complete
+    writeFileSync(pm2MigratedMarker, new Date().toISOString(), 'utf-8');
+    logger.debug('SYSTEM', 'PM2 cleanup completed and marked');
   }
 
   const port = getWorkerPort();

@@ -198,8 +164,16 @@ async function startWorker(): Promise<boolean> {
  * Also ensures worker version matches plugin version
  */
 export async function ensureWorkerRunning(): Promise<void> {
-  // Check if already healthy
-  if (await isWorkerHealthy()) {
+  // Check if already healthy (will throw on fetch errors)
+  let healthy = false;
+  try {
+    healthy = await isWorkerHealthy();
+  } catch (error) {
+    // Worker not running or unreachable - continue to start it
+    healthy = false;
+  }
+
+  if (healthy) {
     // Worker is healthy, but check if version matches
     await ensureWorkerVersionMatches();
     return;

@@ -222,9 +196,13 @@ export async function ensureWorkerRunning(): Promise<void> {
   // Try up to 5 times with 500ms delays (2.5 seconds total)
   for (let i = 0; i < 5; i++) {
     await new Promise(resolve => setTimeout(resolve, 500));
-    if (await isWorkerHealthy()) {
-      await ensureWorkerVersionMatches();
-      return;
+    try {
+      if (await isWorkerHealthy()) {
+        await ensureWorkerVersionMatches();
+        return;
+      }
+    } catch (error) {
+      // Continue trying
     }
   }
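Since `isWorkerHealthy` now throws on fetch errors instead of returning `false`, both call sites in `ensureWorkerRunning` wrap it in try/catch. The retry loop can be sketched with an injected probe — the injection is for illustration and testability; the real code calls `isWorkerHealthy` directly:

```typescript
// Poll a health probe up to `attempts` times, treating thrown errors
// (worker not reachable yet) the same as "not healthy yet".
async function waitForHealthy(
  probe: () => Promise<boolean>,
  attempts = 5,
  delayMs = 500
): Promise<boolean> {
  for (let i = 0; i < attempts; i++) {
    await new Promise(resolve => setTimeout(resolve, delayMs));
    try {
      if (await probe()) return true;
    } catch {
      // Continue trying - mirrors the catch added in the diff
    }
  }
  return false;
}
```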
@@ -98,68 +98,64 @@ class Logger {
   formatTool(toolName: string, toolInput?: any): string {
     if (!toolInput) return toolName;
 
-    try {
-      const input = typeof toolInput === 'string' ? JSON.parse(toolInput) : toolInput;
+    const input = typeof toolInput === 'string' ? JSON.parse(toolInput) : toolInput;
 
-      // Bash: show full command
-      if (toolName === 'Bash' && input.command) {
-        return `${toolName}(${input.command})`;
-      }
-
-      // File operations: show full path
-      if (input.file_path) {
-        return `${toolName}(${input.file_path})`;
-      }
-
-      // NotebookEdit: show full notebook path
-      if (input.notebook_path) {
-        return `${toolName}(${input.notebook_path})`;
-      }
-
-      // Glob: show full pattern
-      if (toolName === 'Glob' && input.pattern) {
-        return `${toolName}(${input.pattern})`;
-      }
-
-      // Grep: show full pattern
-      if (toolName === 'Grep' && input.pattern) {
-        return `${toolName}(${input.pattern})`;
-      }
-
-      // WebFetch/WebSearch: show full URL or query
-      if (input.url) {
-        return `${toolName}(${input.url})`;
-      }
-
-      if (input.query) {
-        return `${toolName}(${input.query})`;
-      }
-
-      // Task: show subagent_type or full description
-      if (toolName === 'Task') {
-        if (input.subagent_type) {
-          return `${toolName}(${input.subagent_type})`;
-        }
-        if (input.description) {
-          return `${toolName}(${input.description})`;
-        }
-      }
-
-      // Skill: show skill name
-      if (toolName === 'Skill' && input.skill) {
-        return `${toolName}(${input.skill})`;
-      }
-
-      // LSP: show operation type
-      if (toolName === 'LSP' && input.operation) {
-        return `${toolName}(${input.operation})`;
-      }
-
-      // Default: just show tool name
-      return toolName;
-    } catch {
-      return toolName;
-    }
+    // Bash: show full command
+    if (toolName === 'Bash' && input.command) {
+      return `${toolName}(${input.command})`;
+    }
+
+    // File operations: show full path
+    if (input.file_path) {
+      return `${toolName}(${input.file_path})`;
+    }
+
+    // NotebookEdit: show full notebook path
+    if (input.notebook_path) {
+      return `${toolName}(${input.notebook_path})`;
+    }
+
+    // Glob: show full pattern
+    if (toolName === 'Glob' && input.pattern) {
+      return `${toolName}(${input.pattern})`;
+    }
+
+    // Grep: show full pattern
+    if (toolName === 'Grep' && input.pattern) {
+      return `${toolName}(${input.pattern})`;
+    }
+
+    // WebFetch/WebSearch: show full URL or query
+    if (input.url) {
+      return `${toolName}(${input.url})`;
+    }
+
+    if (input.query) {
+      return `${toolName}(${input.query})`;
+    }
+
+    // Task: show subagent_type or full description
+    if (toolName === 'Task') {
+      if (input.subagent_type) {
+        return `${toolName}(${input.subagent_type})`;
+      }
+      if (input.description) {
+        return `${toolName}(${input.description})`;
+      }
+    }
+
+    // Skill: show skill name
+    if (toolName === 'Skill' && input.skill) {
+      return `${toolName}(${input.skill})`;
+    }
+
+    // LSP: show operation type
+    if (toolName === 'LSP' && input.operation) {
+      return `${toolName}(${input.operation})`;
+    }
+
+    // Default: just show tool name
+    return toolName;
   }
 
 /**
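The final hunk flattens `formatTool` by removing its try/catch, so a malformed JSON string in `toolInput` now throws at `JSON.parse` rather than silently degrading to the bare tool name. A reduced sketch of the dispatch, covering only a subset of the cases in the diff:

```typescript
// Summarize a tool call as `Name(detail)`; subset of the diff's cases.
// Note: with the try/catch removed, an unparseable string input throws here.
function formatToolSketch(toolName: string, toolInput?: any): string {
  if (!toolInput) return toolName;
  const input = typeof toolInput === 'string' ? JSON.parse(toolInput) : toolInput;

  if (toolName === 'Bash' && input.command) return `${toolName}(${input.command})`;
  if (input.file_path) return `${toolName}(${input.file_path})`;
  if (toolName === 'Grep' && input.pattern) return `${toolName}(${input.pattern})`;

  // Default: just show tool name
  return toolName;
}
```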