refactor: Organize worker into clean route-based HTTP architecture
Major architectural improvements to the worker service:

- Extracted monolithic WorkerService (~1900 lines) into organized route classes
- New HTTP layer with dedicated route handlers:
  - SessionRoutes: session lifecycle operations
  - DataRoutes: data retrieval endpoints
  - SearchRoutes: search/MCP proxy operations
  - SettingsRoutes: settings and configuration
  - ViewerRoutes: health, UI, and SSE streaming
- Added comprehensive README documenting the worker architecture
- Improved the build script to handle worker service compilation
- Added context-generator for hook context operations

This is Phase 1 of the worker refactoring: pure code reorganization with zero functional changes. All existing behavior is preserved while improving maintainability and code organization.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
@@ -31,6 +31,11 @@ const SEARCH_SERVER = {
  source: 'src/servers/search-server.ts'
};

const CONTEXT_GENERATOR = {
  name: 'context-generator',
  source: 'src/services/context-generator.ts'
};

async function buildHooks() {
  console.log('🔨 Building claude-mem hooks and worker service...\n');
@@ -117,6 +122,26 @@ async function buildHooks() {
  const searchServerStats = fs.statSync(`${hooksDir}/${SEARCH_SERVER.name}.cjs`);
  console.log(`✓ search-server built (${(searchServerStats.size / 1024).toFixed(2)} KB)`);

  // Build context generator
  console.log(`\n🔧 Building context generator...`);
  await build({
    entryPoints: [CONTEXT_GENERATOR.source],
    bundle: true,
    platform: 'node',
    target: 'node18',
    format: 'cjs',
    outfile: `${hooksDir}/${CONTEXT_GENERATOR.name}.cjs`,
    minify: true,
    logLevel: 'error',
    external: ['better-sqlite3'],
    define: {
      '__DEFAULT_PACKAGE_VERSION__': `"${version}"`
    }
  });

  const contextGenStats = fs.statSync(`${hooksDir}/${CONTEXT_GENERATOR.name}.cjs`);
  console.log(`✓ context-generator built (${(contextGenStats.size / 1024).toFixed(2)} KB)`);

  // Build each hook
  for (const hook of HOOKS) {
    console.log(`\n🔧 Building ${hook.name}...`);
File diff suppressed because it is too large (+49, -1788)
@@ -0,0 +1,155 @@
# Worker Service Architecture

## Overview

The Worker Service is an Express HTTP server that handles all claude-mem operations. It runs on port 37777 (configurable via `CLAUDE_MEM_WORKER_PORT`) and is managed by PM2.

## Request Flow

```
Hook (plugin/scripts/*-hook.js)
  → HTTP Request to Worker (localhost:37777)
  → Route Handler (http/routes/*.ts)
  → MCP Server Tool (for search) OR Domain Service (for session/data)
  → Database (SQLite3 + Chroma vector DB)
```
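The hook-to-worker handshake above depends only on the port. As a sketch of the default-and-override behavior (the `workerBaseUrl` helper is hypothetical, not part of this codebase; the real hooks may resolve the port differently):

```typescript
// Hypothetical helper: resolve the worker's base URL.
// CLAUDE_MEM_WORKER_PORT overrides the default port 37777.
function workerBaseUrl(env: Record<string, string | undefined>): string {
  const port = Number(env.CLAUDE_MEM_WORKER_PORT) || 37777;
  return `http://localhost:${port}`;
}

// A hook would then issue requests against this base URL,
// e.g. `${workerBaseUrl(process.env)}/health`.
```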
## Directory Structure

```
src/services/worker/
├── README.md             # This file
├── WorkerService.ts      # Slim orchestrator (~150 lines)
├── http/                 # HTTP layer
│   ├── middleware.ts     # Shared middleware (logging, CORS, etc.)
│   └── routes/           # Route handlers organized by domain
│       ├── SessionRoutes.ts   # Session lifecycle (init, observations, summarize, complete)
│       ├── DataRoutes.ts      # Data retrieval (get observations, summaries, prompts, stats)
│       ├── SearchRoutes.ts    # Search/MCP proxy (all search endpoints)
│       ├── SettingsRoutes.ts  # Settings, MCP toggle, branch switching
│       └── ViewerRoutes.ts    # Health check, viewer UI, SSE stream
└── domain/               # Business logic (existing services, NO CHANGES in Phase 1)
    ├── DatabaseManager.ts     # SQLite connection management
    ├── SessionManager.ts      # Session state tracking
    ├── SDKAgent.ts            # Claude Agent SDK for observations/summaries
    ├── SSEBroadcaster.ts      # Server-Sent Events for real-time updates
    ├── PaginationHelper.ts    # Query pagination utilities
    ├── SettingsManager.ts     # User settings CRUD
    └── BranchManager.ts       # Git branch operations
```
## Route Organization

### ViewerRoutes.ts
- `GET /health` - Health check endpoint
- `GET /` - Serve viewer UI (React app)
- `GET /stream` - SSE stream for real-time updates

### SessionRoutes.ts
Session lifecycle operations (use domain services directly):
- `POST /sessions/init` - Initialize a new session
- `POST /sessions/:sessionId/observations` - Add tool usage observations
- `POST /sessions/:sessionId/summarize` - Trigger session summary
- `GET /sessions/:sessionId/status` - Get session status
- `DELETE /sessions/:sessionId` - Delete session
- `POST /sessions/:sessionId/complete` - Mark session complete
- `POST /sessions/claude-id/:claudeId/observations` - Add observations by claude_id
- `POST /sessions/claude-id/:claudeId/summarize` - Summarize by claude_id
- `POST /sessions/claude-id/:claudeId/complete` - Complete by claude_id
### DataRoutes.ts
Data retrieval operations (use domain services directly):
- `GET /api/observations` - List observations (paginated)
- `GET /api/summaries` - List session summaries (paginated)
- `GET /api/prompts` - List user prompts (paginated)
- `GET /api/observation/:id` - Get observation by ID
- `GET /api/session/:id` - Get session by ID
- `GET /api/prompt/:id` - Get prompt by ID
- `GET /api/stats` - Get database statistics
- `GET /api/projects` - List all projects
- `GET /api/processing-status` - Get processing status
- `POST /api/processing` - Broadcast processing status
### SearchRoutes.ts
All search operations (proxy to MCP server):
- `GET /api/search` - Unified search (observations + sessions + prompts)
- `GET /api/timeline` - Unified timeline context
- `GET /api/decisions` - Decision-type observations
- `GET /api/changes` - Change-related observations
- `GET /api/how-it-works` - How-it-works explanations
- `GET /api/search/observations` - Search observations
- `GET /api/search/sessions` - Search sessions
- `GET /api/search/prompts` - Search prompts
- `GET /api/search/by-concept` - Find by concept tag
- `GET /api/search/by-file` - Find by file path
- `GET /api/search/by-type` - Find by observation type
- `GET /api/context/recent` - Get recent context
- `GET /api/context/timeline` - Get context timeline
- `GET /api/context/preview` - Preview context
- `GET /api/context/inject` - Inject context
- `GET /api/timeline/by-query` - Timeline by search query
- `GET /api/search/help` - Search help
### SettingsRoutes.ts
Settings and configuration (use domain services directly):
- `GET /settings` - Get user settings
- `POST /settings` - Update user settings
- `GET /mcp/status` - Get MCP server status
- `POST /mcp/toggle` - Toggle MCP server on/off
- `GET /branch/status` - Get git branch info
- `POST /branch/switch` - Switch git branch
- `POST /branch/update` - Pull branch updates
## Current State (Phase 1)

**Phase 1** is a pure code reorganization with ZERO functional changes:
- Extract route handlers from the WorkerService.ts monolith
- Organize them into logical route classes
- Keep all existing behavior identical

**MCP vs Direct DB Split** (inherited, not changed in Phase 1):
- Search operations → MCP server (claude-mem-search)
- Session/data operations → direct DB access via domain services
## Future Phase 2

Phase 2 will unify the architecture:
1. Expand the MCP server to handle ALL operations (not just search)
2. Convert all route handlers to proxy through MCP
3. Move database logic from domain services into MCP tools
4. Result: the worker becomes a pure HTTP → MCP proxy for maximum portability

This separation allows the worker to be deployed anywhere (as a CLI tool, cloud service, etc.) without carrying database dependencies.
## Adding New Endpoints

1. Choose the appropriate route file based on the endpoint's purpose
2. Add the route handler method to the class
3. Register the route in the `setupRoutes()` method
4. Import any needed domain services in the constructor
5. Follow the existing patterns for error handling and logging

Example:
```typescript
// In DataRoutes.ts
private async handleGetFoo(req: Request, res: Response): Promise<void> {
  try {
    const result = await this.dbManager.getFoo();
    res.json(result);
  } catch (error) {
    logger.failure('WORKER', 'Get foo failed', {}, error as Error);
    res.status(500).json({ error: (error as Error).message });
  }
}

// Register in setupRoutes()
app.get('/foo', this.handleGetFoo.bind(this));
```
## Key Design Principles

1. **Progressive Disclosure**: Navigate from high-level (WorkerService.ts) to specific routes to implementation details
2. **Single Responsibility**: Each route class handles one domain area
3. **Dependency Injection**: Route classes receive only the services they need
4. **Consistent Error Handling**: All handlers use try/catch with `logger.failure()`
5. **Bound Methods**: All route handlers use `.bind(this)` to preserve context
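Principle 5 exists because Express invokes handlers as plain functions, so an unbound class method loses its `this`. A minimal, self-contained illustration (not code from this repo):

```typescript
class Counter {
  private count = 0;
  increment(): number {
    // `this` must still point at the Counter instance when called
    return ++this.count;
  }
}

const c = new Counter();
const bound = c.increment.bind(c); // what setupRoutes() does with each handler
bound(); // works: `this` is preserved even when called as a bare function
```

Calling `c.increment` unbound (e.g. `const f = c.increment; f()`) would throw, because `this` is `undefined` inside a class method invoked bare.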
@@ -0,0 +1,89 @@
/**
 * HTTP Middleware for Worker Service
 *
 * Extracted from WorkerService.ts for better organization.
 * Handles request/response logging, CORS, JSON parsing, and static file serving.
 */

import express, { Request, Response, NextFunction, RequestHandler } from 'express';
import cors from 'cors';
import path from 'path';
import { getPackageRoot } from '../../../shared/paths.js';
import { logger } from '../../../utils/logger.js';

/**
 * Create all middleware for the worker service
 * @param summarizeRequestBody - Function to summarize request bodies for logging
 * @returns Array of middleware functions
 */
export function createMiddleware(
  summarizeRequestBody: (method: string, path: string, body: any) => string
): RequestHandler[] {
  const middlewares: RequestHandler[] = [];

  // JSON parsing with 50mb limit
  middlewares.push(express.json({ limit: '50mb' }));

  // CORS
  middlewares.push(cors());

  // HTTP request/response logging
  middlewares.push((req: Request, res: Response, next: NextFunction) => {
    // Skip logging for static assets and health checks
    if (req.path.startsWith('/health') || req.path === '/' || req.path.includes('.')) {
      return next();
    }

    const start = Date.now();
    const requestId = `${req.method}-${Date.now()}`;

    // Log incoming request with body summary
    const bodySummary = summarizeRequestBody(req.method, req.path, req.body);
    logger.info('HTTP', `→ ${req.method} ${req.path}`, { requestId }, bodySummary);

    // Capture response
    const originalSend = res.send.bind(res);
    res.send = function(body: any) {
      const duration = Date.now() - start;
      logger.info('HTTP', `← ${res.statusCode} ${req.path}`, { requestId, duration: `${duration}ms` });
      return originalSend(body);
    };

    next();
  });

  // Serve static files for web UI (viewer-bundle.js, logos, fonts, etc.)
  const packageRoot = getPackageRoot();
  const uiDir = path.join(packageRoot, 'plugin', 'ui');
  middlewares.push(express.static(uiDir));

  return middlewares;
}

/**
 * Summarize request body for logging
 * Used to avoid logging sensitive data or large payloads
 */
export function summarizeRequestBody(method: string, path: string, body: any): string {
  if (!body || Object.keys(body).length === 0) return '';

  // Session init
  if (path.includes('/init')) {
    return '';
  }

  // Observations
  if (path.includes('/observations')) {
    const toolName = body.tool_name || '?';
    const toolInput = body.tool_input;
    const toolSummary = logger.formatTool(toolName, toolInput);
    return `tool=${toolSummary}`;
  }

  // Summarize request
  if (path.includes('/summarize')) {
    return 'requesting summary';
  }

  return '';
}
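The response-capture middleware above monkey-patches `res.send` to time the request without changing what the handler returns. The same pattern in isolation (a generic sketch, not the repo's API; `wrapSend` is hypothetical):

```typescript
interface Sender {
  send(body: string): string;
}

// Wrap an object's send() so completion time is reported to a callback,
// while the original behavior and return value are preserved.
function wrapSend(target: Sender, onDone: (elapsedMs: number) => void): Sender {
  const original = target.send.bind(target);
  const start = Date.now();
  target.send = (body: string) => {
    onDone(Date.now() - start); // report duration first, like the middleware's log line
    return original(body);      // then delegate to the original send
  };
  return target;
}
```

Express's real `res.send` returns the `Response` object rather than a string, but the wrapping technique is identical.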
@@ -0,0 +1,292 @@
/**
 * Data Routes
 *
 * Handles data retrieval operations: observations, summaries, prompts, stats, processing status.
 * All endpoints use direct database access via domain services.
 */

import express, { Request, Response } from 'express';
import path from 'path';
import { readFileSync, statSync, existsSync } from 'fs';
import { homedir } from 'os';
import { getPackageRoot } from '../../../../shared/paths.js';
import { getWorkerPort } from '../../../../shared/worker-utils.js';
import { logger } from '../../../../utils/logger.js';
import { PaginationHelper } from '../../PaginationHelper.js';
import { DatabaseManager } from '../../DatabaseManager.js';
import { SessionManager } from '../../SessionManager.js';
import { SSEBroadcaster } from '../../SSEBroadcaster.js';
import type { WorkerService } from '../../../worker-service.js';

export class DataRoutes {
  constructor(
    private paginationHelper: PaginationHelper,
    private dbManager: DatabaseManager,
    private sessionManager: SessionManager,
    private sseBroadcaster: SSEBroadcaster,
    private workerService: WorkerService,
    private startTime: number
  ) {}

  setupRoutes(app: express.Application): void {
    // Pagination endpoints
    app.get('/api/observations', this.handleGetObservations.bind(this));
    app.get('/api/summaries', this.handleGetSummaries.bind(this));
    app.get('/api/prompts', this.handleGetPrompts.bind(this));

    // Fetch-by-ID endpoints
    app.get('/api/observation/:id', this.handleGetObservationById.bind(this));
    app.get('/api/session/:id', this.handleGetSessionById.bind(this));
    app.get('/api/prompt/:id', this.handleGetPromptById.bind(this));

    // Metadata endpoints
    app.get('/api/stats', this.handleGetStats.bind(this));
    app.get('/api/projects', this.handleGetProjects.bind(this));

    // Processing status endpoints
    app.get('/api/processing-status', this.handleGetProcessingStatus.bind(this));
    app.post('/api/processing', this.handleSetProcessing.bind(this));
  }
  /**
   * Get paginated observations
   */
  private handleGetObservations(req: Request, res: Response): void {
    try {
      const { offset, limit, project } = this.parsePaginationParams(req);
      const result = this.paginationHelper.getObservations(offset, limit, project);
      res.json(result);
    } catch (error) {
      logger.failure('WORKER', 'Get observations failed', {}, error as Error);
      res.status(500).json({ error: (error as Error).message });
    }
  }

  /**
   * Get paginated summaries
   */
  private handleGetSummaries(req: Request, res: Response): void {
    try {
      const { offset, limit, project } = this.parsePaginationParams(req);
      const result = this.paginationHelper.getSummaries(offset, limit, project);
      res.json(result);
    } catch (error) {
      logger.failure('WORKER', 'Get summaries failed', {}, error as Error);
      res.status(500).json({ error: (error as Error).message });
    }
  }

  /**
   * Get paginated user prompts
   */
  private handleGetPrompts(req: Request, res: Response): void {
    try {
      const { offset, limit, project } = this.parsePaginationParams(req);
      const result = this.paginationHelper.getPrompts(offset, limit, project);
      res.json(result);
    } catch (error) {
      logger.failure('WORKER', 'Get prompts failed', {}, error as Error);
      res.status(500).json({ error: (error as Error).message });
    }
  }
  /**
   * Get observation by ID
   * GET /api/observation/:id
   */
  private handleGetObservationById(req: Request, res: Response): void {
    try {
      const id = parseInt(req.params.id, 10);
      if (isNaN(id)) {
        res.status(400).json({ error: 'Invalid observation ID' });
        return;
      }

      const store = this.dbManager.getSessionStore();
      const observation = store.getObservationById(id);

      if (!observation) {
        res.status(404).json({ error: `Observation #${id} not found` });
        return;
      }

      res.json(observation);
    } catch (error) {
      logger.failure('WORKER', 'Get observation by ID failed', {}, error as Error);
      res.status(500).json({ error: (error as Error).message });
    }
  }

  /**
   * Get session by ID
   * GET /api/session/:id
   */
  private handleGetSessionById(req: Request, res: Response): void {
    try {
      const id = parseInt(req.params.id, 10);
      if (isNaN(id)) {
        res.status(400).json({ error: 'Invalid session ID' });
        return;
      }

      const store = this.dbManager.getSessionStore();
      const sessions = store.getSessionSummariesByIds([id]);

      if (sessions.length === 0) {
        res.status(404).json({ error: `Session #${id} not found` });
        return;
      }

      res.json(sessions[0]);
    } catch (error) {
      logger.failure('WORKER', 'Get session by ID failed', {}, error as Error);
      res.status(500).json({ error: (error as Error).message });
    }
  }

  /**
   * Get user prompt by ID
   * GET /api/prompt/:id
   */
  private handleGetPromptById(req: Request, res: Response): void {
    try {
      const id = parseInt(req.params.id, 10);
      if (isNaN(id)) {
        res.status(400).json({ error: 'Invalid prompt ID' });
        return;
      }

      const store = this.dbManager.getSessionStore();
      const prompts = store.getUserPromptsByIds([id]);

      if (prompts.length === 0) {
        res.status(404).json({ error: `Prompt #${id} not found` });
        return;
      }

      res.json(prompts[0]);
    } catch (error) {
      logger.failure('WORKER', 'Get prompt by ID failed', {}, error as Error);
      res.status(500).json({ error: (error as Error).message });
    }
  }
  /**
   * Get database statistics (with worker metadata)
   */
  private handleGetStats(req: Request, res: Response): void {
    try {
      const db = this.dbManager.getSessionStore().db;

      // Read version from package.json
      const packageRoot = getPackageRoot();
      const packageJsonPath = path.join(packageRoot, 'package.json');
      const packageJson = JSON.parse(readFileSync(packageJsonPath, 'utf-8'));
      const version = packageJson.version;

      // Get database stats
      const totalObservations = db.prepare('SELECT COUNT(*) as count FROM observations').get() as { count: number };
      const totalSessions = db.prepare('SELECT COUNT(*) as count FROM sdk_sessions').get() as { count: number };
      const totalSummaries = db.prepare('SELECT COUNT(*) as count FROM session_summaries').get() as { count: number };

      // Get database file size and path
      const dbPath = path.join(homedir(), '.claude-mem', 'claude-mem.db');
      let dbSize = 0;
      if (existsSync(dbPath)) {
        dbSize = statSync(dbPath).size;
      }

      // Worker metadata
      const uptime = Math.floor((Date.now() - this.startTime) / 1000);
      const activeSessions = this.sessionManager.getActiveSessionCount();
      const sseClients = this.sseBroadcaster.getClientCount();

      res.json({
        worker: {
          version,
          uptime,
          activeSessions,
          sseClients,
          port: getWorkerPort()
        },
        database: {
          path: dbPath,
          size: dbSize,
          observations: totalObservations.count,
          sessions: totalSessions.count,
          summaries: totalSummaries.count
        }
      });
    } catch (error) {
      logger.failure('WORKER', 'Get stats failed', {}, error as Error);
      res.status(500).json({ error: (error as Error).message });
    }
  }
  /**
   * Get list of distinct projects from observations
   * GET /api/projects
   */
  private handleGetProjects(req: Request, res: Response): void {
    try {
      const db = this.dbManager.getSessionStore().db;

      const rows = db.prepare(`
        SELECT DISTINCT project
        FROM observations
        WHERE project IS NOT NULL
        GROUP BY project
        ORDER BY MAX(created_at_epoch) DESC
      `).all() as Array<{ project: string }>;

      const projects = rows.map(row => row.project);

      res.json({ projects });
    } catch (error) {
      logger.failure('WORKER', 'Get projects failed', {}, error as Error);
      res.status(500).json({ error: (error as Error).message });
    }
  }
  /**
   * Get current processing status
   * GET /api/processing-status
   */
  private handleGetProcessingStatus(req: Request, res: Response): void {
    const isProcessing = this.sessionManager.isAnySessionProcessing();
    const queueDepth = this.sessionManager.getTotalActiveWork(); // Includes queued + actively processing
    res.json({ isProcessing, queueDepth });
  }

  /**
   * Set processing status (called by hooks)
   * NOTE: This now broadcasts computed status based on active processing (ignores input)
   */
  private handleSetProcessing(req: Request, res: Response): void {
    try {
      // Broadcast current computed status (ignores manual input)
      this.workerService.broadcastProcessingStatus();

      const isProcessing = this.sessionManager.isAnySessionProcessing();
      const queueDepth = this.sessionManager.getTotalQueueDepth();
      const activeSessions = this.sessionManager.getActiveSessionCount();
      logger.debug('WORKER', 'Processing status broadcast', { isProcessing, queueDepth, activeSessions });

      res.json({ status: 'ok', isProcessing });
    } catch (error) {
      logger.failure('WORKER', 'Failed to broadcast processing status', {}, error as Error);
      res.status(500).json({ error: (error as Error).message });
    }
  }

  /**
   * Parse pagination parameters from request query
   */
  private parsePaginationParams(req: Request): { offset: number; limit: number; project?: string } {
    const offset = parseInt(req.query.offset as string, 10) || 0;
    const limit = Math.min(parseInt(req.query.limit as string, 10) || 20, 100); // Max 100
    const project = req.query.project as string | undefined;

    return { offset, limit, project };
  }
}
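`parsePaginationParams` above clamps client input: `offset` defaults to 0, and `limit` defaults to 20 with a hard cap of 100. The same logic as a standalone function (a sketch for illustration, not the class method itself):

```typescript
// Standalone sketch of the pagination clamping used by DataRoutes.
function parsePagination(
  query: Record<string, string | undefined>
): { offset: number; limit: number; project?: string } {
  const offset = parseInt(query.offset ?? '', 10) || 0;               // NaN falls back to 0
  const limit = Math.min(parseInt(query.limit ?? '', 10) || 20, 100); // default 20, max 100
  return { offset, limit, project: query.project };
}
```

Note that `|| 0` also maps an explicit `offset=0` (and any non-numeric value) to the default, which is the behavior the route class inherits.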
@@ -0,0 +1,491 @@
/**
 * Search Routes
 *
 * Handles all search operations by proxying to the MCP search server.
 * All endpoints call MCP tools via the client connection.
 */

import express, { Request, Response } from 'express';
import path from 'path';
import { Client } from '@modelcontextprotocol/sdk/client/index.js';
import { getPackageRoot } from '../../../../shared/paths.js';
import { logger } from '../../../../utils/logger.js';

export class SearchRoutes {
  constructor(
    private mcpClient: Client
  ) {}

  setupRoutes(app: express.Application): void {
    // Unified endpoints (new consolidated API)
    app.get('/api/search', this.handleUnifiedSearch.bind(this));
    app.get('/api/timeline', this.handleUnifiedTimeline.bind(this));
    app.get('/api/decisions', this.handleDecisions.bind(this));
    app.get('/api/changes', this.handleChanges.bind(this));
    app.get('/api/how-it-works', this.handleHowItWorks.bind(this));

    // Backward compatibility endpoints
    app.get('/api/search/observations', this.handleSearchObservations.bind(this));
    app.get('/api/search/sessions', this.handleSearchSessions.bind(this));
    app.get('/api/search/prompts', this.handleSearchPrompts.bind(this));
    app.get('/api/search/by-concept', this.handleSearchByConcept.bind(this));
    app.get('/api/search/by-file', this.handleSearchByFile.bind(this));
    app.get('/api/search/by-type', this.handleSearchByType.bind(this));

    // Context endpoints
    app.get('/api/context/recent', this.handleGetRecentContext.bind(this));
    app.get('/api/context/timeline', this.handleGetContextTimeline.bind(this));
    app.get('/api/context/preview', this.handleContextPreview.bind(this));
    app.get('/api/context/inject', this.handleContextInject.bind(this));

    // Timeline and help endpoints
    app.get('/api/timeline/by-query', this.handleGetTimelineByQuery.bind(this));
    app.get('/api/search/help', this.handleSearchHelp.bind(this));
  }
  /**
   * Unified search (observations + sessions + prompts)
   * GET /api/search?query=...&type=observations&format=index&limit=20
   */
  private async handleUnifiedSearch(req: Request, res: Response): Promise<void> {
    try {
      const result = await this.mcpClient.callTool({
        name: 'search',
        arguments: req.query
      });
      res.json(result.content);
    } catch (error) {
      logger.failure('WORKER', 'Unified search failed', {}, error as Error);
      res.status(500).json({ error: (error as Error).message });
    }
  }

  /**
   * Unified timeline (anchor or query-based)
   * GET /api/timeline?anchor=123 OR GET /api/timeline?query=...
   */
  private async handleUnifiedTimeline(req: Request, res: Response): Promise<void> {
    try {
      const result = await this.mcpClient.callTool({
        name: 'timeline',
        arguments: req.query
      });
      res.json(result.content);
    } catch (error) {
      logger.failure('WORKER', 'Unified timeline failed', {}, error as Error);
      res.status(500).json({ error: (error as Error).message });
    }
  }

  /**
   * Semantic shortcut for finding decision observations
   * GET /api/decisions?format=index&limit=20
   */
  private async handleDecisions(req: Request, res: Response): Promise<void> {
    try {
      const result = await this.mcpClient.callTool({
        name: 'decisions',
        arguments: req.query
      });
      res.json(result.content);
    } catch (error) {
      logger.failure('WORKER', 'Decisions search failed', {}, error as Error);
      res.status(500).json({ error: (error as Error).message });
    }
  }

  /**
   * Semantic shortcut for finding change-related observations
   * GET /api/changes?format=index&limit=20
   */
  private async handleChanges(req: Request, res: Response): Promise<void> {
    try {
      const result = await this.mcpClient.callTool({
        name: 'changes',
        arguments: req.query
      });
      res.json(result.content);
    } catch (error) {
      logger.failure('WORKER', 'Changes search failed', {}, error as Error);
      res.status(500).json({ error: (error as Error).message });
    }
  }

  /**
   * Semantic shortcut for finding "how it works" explanations
   * GET /api/how-it-works?format=index&limit=20
   */
  private async handleHowItWorks(req: Request, res: Response): Promise<void> {
    try {
      const result = await this.mcpClient.callTool({
        name: 'how_it_works',
        arguments: req.query
      });
      res.json(result.content);
    } catch (error) {
      logger.failure('WORKER', 'How it works search failed', {}, error as Error);
      res.status(500).json({ error: (error as Error).message });
    }
  }
  /**
   * Search observations (use /api/search?type=observations instead)
   * GET /api/search/observations?query=...&format=index&limit=20&project=...
   */
  private async handleSearchObservations(req: Request, res: Response): Promise<void> {
    try {
      const result = await this.mcpClient.callTool({
        name: 'search_observations',
        arguments: req.query
      });
      res.json(result.content);
    } catch (error) {
      logger.failure('WORKER', 'Search failed', {}, error as Error);
      res.status(500).json({ error: (error as Error).message });
    }
  }

  /**
   * Search session summaries
   * GET /api/search/sessions?query=...&format=index&limit=20
   */
  private async handleSearchSessions(req: Request, res: Response): Promise<void> {
    try {
      const result = await this.mcpClient.callTool({
        name: 'search_sessions',
        arguments: req.query
      });
      res.json(result.content);
    } catch (error) {
      logger.failure('WORKER', 'Search failed', {}, error as Error);
      res.status(500).json({ error: (error as Error).message });
    }
  }

  /**
   * Search user prompts
   * GET /api/search/prompts?query=...&format=index&limit=20
   */
  private async handleSearchPrompts(req: Request, res: Response): Promise<void> {
    try {
      const result = await this.mcpClient.callTool({
        name: 'search_user_prompts',
        arguments: req.query
      });
      res.json(result.content);
    } catch (error) {
      logger.failure('WORKER', 'Search failed', {}, error as Error);
      res.status(500).json({ error: (error as Error).message });
    }
  }

  /**
   * Search observations by concept
   * GET /api/search/by-concept?concept=discovery&format=index&limit=5
   */
  private async handleSearchByConcept(req: Request, res: Response): Promise<void> {
    try {
      const result = await this.mcpClient.callTool({
        name: 'find_by_concept',
        arguments: req.query
      });
      res.json(result.content);
    } catch (error) {
      logger.failure('WORKER', 'Search failed', {}, error as Error);
      res.status(500).json({ error: (error as Error).message });
    }
  }

  /**
   * Search by file path
   * GET /api/search/by-file?filePath=...&format=index&limit=10
   */
  private async handleSearchByFile(req: Request, res: Response): Promise<void> {
    try {
      const result = await this.mcpClient.callTool({
        name: 'find_by_file',
        arguments: req.query
      });
      res.json(result.content);
    } catch (error) {
      logger.failure('WORKER', 'Search failed', {}, error as Error);
      res.status(500).json({ error: (error as Error).message });
    }
  }

  /**
   * Search observations by type
   * GET /api/search/by-type?type=bugfix&format=index&limit=10
   */
  private async handleSearchByType(req: Request, res: Response): Promise<void> {
    try {
      const result = await this.mcpClient.callTool({
        name: 'find_by_type',
        arguments: req.query
      });
      res.json(result.content);
    } catch (error) {
      logger.failure('WORKER', 'Search failed', {}, error as Error);
      res.status(500).json({ error: (error as Error).message });
    }
  }
  /**
   * Get recent context (summaries and observations for a project)
   * GET /api/context/recent?project=...&limit=3
   */
  private async handleGetRecentContext(req: Request, res: Response): Promise<void> {
    try {
      const result = await this.mcpClient.callTool({
        name: 'get_recent_context',
        arguments: req.query
      });
      res.json(result.content);
    } catch (error) {
      logger.failure('WORKER', 'Search failed', {}, error as Error);
      res.status(500).json({ error: (error as Error).message });
    }
  }

  /**
   * Get context timeline around an anchor point
   * GET /api/context/timeline?anchor=123&depth_before=10&depth_after=10&project=...
   */
  private async handleGetContextTimeline(req: Request, res: Response): Promise<void> {
    try {
      const result = await this.mcpClient.callTool({
        name: 'get_context_timeline',
        arguments: req.query
      });
      res.json(result.content);
    } catch (error) {
      logger.failure('WORKER', 'Search failed', {}, error as Error);
      res.status(500).json({ error: (error as Error).message });
    }
  }

  /**
   * Generate context preview for settings modal
   * GET /api/context/preview?project=...
   */
  private async handleContextPreview(req: Request, res: Response): Promise<void> {
    try {
      // Dynamic import to use the BUILT context-hook function
      const packageRoot = getPackageRoot();
      const contextHookPath = path.join(packageRoot, 'plugin', 'scripts', 'context-hook.js');
      const { contextHook } = await import(contextHookPath);

      // Get project from query parameter
      const projectName = req.query.project as string;

      if (!projectName) {
        res.status(400).json({ error: 'Project parameter is required' });
        return;
      }

      // Use project name as CWD (contextHook uses path.basename to get the project)
      const cwd = `/preview/${projectName}`;

      // Generate preview context (with colors for terminal display)
      const contextText = await contextHook(
        {
          session_id: 'preview-' + Date.now(),
          cwd: cwd
        },
        true // useColors=true for ANSI terminal output
      );

      // Return as plain text
      res.setHeader('Content-Type', 'text/plain; charset=utf-8');
      res.send(contextText);
    } catch (error) {
      logger.failure('WORKER', 'Context preview generation failed', {}, error as Error);
      res.status(500).json({
        error: 'Failed to generate context preview',
        message: (error as Error).message
      });
    }
  }

  /**
   * Context injection endpoint for hooks
   * GET /api/context/inject?project=...&colors=true
   *
   * Returns pre-formatted context string ready for display.
   * Use colors=true for ANSI-colored terminal output.
   */
  private async handleContextInject(req: Request, res: Response): Promise<void> {
    try {
      const projectName = req.query.project as string;
      const useColors = req.query.colors === 'true';

      if (!projectName) {
        res.status(400).json({ error: 'Project parameter is required' });
        return;
      }

      // Import context generator (runs in worker, has access to database)
      // Note: After bundling, context-generator.cjs is in the same directory as worker-service.cjs
      const { generateContext } = await import('./context-generator.cjs');

      // Use project name as CWD (generateContext uses path.basename to get the project)
      const cwd = `/context/${projectName}`;

      // Generate context
      const contextText = await generateContext(
        {
          session_id: 'context-inject-' + Date.now(),
          cwd: cwd
        },
        useColors
      );

      // Return as plain text
      res.setHeader('Content-Type', 'text/plain; charset=utf-8');
      res.send(contextText);
    } catch (error) {
      logger.failure('WORKER', 'Context injection failed', {}, error as Error);
      res.status(500).json({
        error: 'Failed to generate context',
        message: (error as Error).message
      });
    }
  }

  /**
   * Get timeline by query (search first, then get timeline around best match)
   * GET /api/timeline/by-query?query=...&mode=auto&depth_before=10&depth_after=10
   */
  private async handleGetTimelineByQuery(req: Request, res: Response): Promise<void> {
    try {
      const result = await this.mcpClient.callTool({
        name: 'get_timeline_by_query',
        arguments: req.query
      });
      res.json(result.content);
    } catch (error) {
      logger.failure('WORKER', 'Search failed', {}, error as Error);
      res.status(500).json({ error: (error as Error).message });
    }
  }

  /**
   * Get search help documentation
   * GET /api/search/help
   */
  private handleSearchHelp(req: Request, res: Response): void {
    res.json({
      title: 'Claude-Mem Search API',
      description: 'HTTP API for searching persistent memory',
      endpoints: [
        {
          path: '/api/search/observations',
          method: 'GET',
          description: 'Search observations using full-text search',
          parameters: {
            query: 'Search query (required)',
            format: 'Response format: "index" or "full" (default: "full")',
            limit: 'Number of results (default: 20)',
            project: 'Filter by project name (optional)'
          }
        },
        {
          path: '/api/search/sessions',
          method: 'GET',
          description: 'Search session summaries using full-text search',
          parameters: {
            query: 'Search query (required)',
            format: 'Response format: "index" or "full" (default: "full")',
            limit: 'Number of results (default: 20)'
          }
        },
        {
          path: '/api/search/prompts',
          method: 'GET',
          description: 'Search user prompts using full-text search',
          parameters: {
            query: 'Search query (required)',
            format: 'Response format: "index" or "full" (default: "full")',
            limit: 'Number of results (default: 20)',
            project: 'Filter by project name (optional)'
          }
        },
        {
          path: '/api/search/by-concept',
          method: 'GET',
          description: 'Find observations by concept tag',
          parameters: {
            concept: 'Concept tag (required): discovery, decision, bugfix, feature, refactor',
            format: 'Response format: "index" or "full" (default: "full")',
            limit: 'Number of results (default: 10)',
            project: 'Filter by project name (optional)'
          }
        },
        {
          path: '/api/search/by-file',
          method: 'GET',
          description: 'Find observations and sessions by file path',
          parameters: {
            filePath: 'File path or partial path (required)',
            format: 'Response format: "index" or "full" (default: "full")',
            limit: 'Number of results per type (default: 10)',
            project: 'Filter by project name (optional)'
          }
        },
        {
          path: '/api/search/by-type',
          method: 'GET',
          description: 'Find observations by type',
          parameters: {
            type: 'Observation type (required): discovery, decision, bugfix, feature, refactor',
            format: 'Response format: "index" or "full" (default: "full")',
            limit: 'Number of results (default: 10)',
            project: 'Filter by project name (optional)'
          }
        },
        {
          path: '/api/context/recent',
          method: 'GET',
          description: 'Get recent session context including summaries and observations',
          parameters: {
            project: 'Project name (default: current directory)',
            limit: 'Number of recent sessions (default: 3)'
          }
        },
        {
          path: '/api/context/timeline',
          method: 'GET',
          description: 'Get unified timeline around a specific point in time',
          parameters: {
            anchor: 'Anchor point: observation ID, session ID (e.g., "S123"), or ISO timestamp (required)',
            depth_before: 'Number of records before anchor (default: 10)',
            depth_after: 'Number of records after anchor (default: 10)',
            project: 'Filter by project name (optional)'
          }
        },
        {
          path: '/api/timeline/by-query',
          method: 'GET',
          description: 'Search for best match, then get timeline around it',
          parameters: {
            query: 'Search query (required)',
            mode: 'Search mode: "auto", "observations", or "sessions" (default: "auto")',
            depth_before: 'Number of records before match (default: 10)',
            depth_after: 'Number of records after match (default: 10)',
            project: 'Filter by project name (optional)'
          }
        },
        {
          path: '/api/search/help',
          method: 'GET',
          description: 'Get this help documentation'
        }
      ],
      examples: [
        'curl "http://localhost:37777/api/search/observations?query=authentication&format=index&limit=5"',
        'curl "http://localhost:37777/api/search/by-type?type=bugfix&limit=10"',
        'curl "http://localhost:37777/api/context/recent?project=claude-mem&limit=3"',
        'curl "http://localhost:37777/api/context/timeline?anchor=123&depth_before=5&depth_after=5"'
      ]
    });
  }
}
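The proxy handlers above all share one shape: forward `req.query` to an MCP tool and return `result.content`. A minimal, self-contained sketch of how that repetition could be factored into a handler factory — the `McpClient` interface, `ToolResult`, and the fake client are illustrative stand-ins, not the worker's actual types:

```typescript
// A tool result as these handlers consume it: only `content` is forwarded.
type ToolResult = { content: unknown };

// Stand-in for the MCP client shape the handlers rely on (assumption).
interface McpClient {
  callTool(args: { name: string; arguments: Record<string, unknown> }): Promise<ToolResult>;
}

// One factory replaces the repeated try/callTool/return bodies.
function makeToolProxy(client: McpClient, toolName: string) {
  return async (query: Record<string, unknown>): Promise<unknown> => {
    const result = await client.callTool({ name: toolName, arguments: query });
    return result.content;
  };
}

// Usage with a fake client that echoes its input:
const fakeClient: McpClient = {
  async callTool({ name, arguments: args }) {
    return { content: { tool: name, echoed: args } };
  }
};

makeToolProxy(fakeClient, 'search_user_prompts')({ query: 'auth', limit: 5 })
  .then(content => console.log(JSON.stringify(content)));
// prints {"tool":"search_user_prompts","echoed":{"query":"auth","limit":5}}
```

This is purely illustrative; the refactor in this commit deliberately keeps each handler explicit, which preserves per-endpoint error logging.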
@@ -0,0 +1,540 @@
/**
 * Session Routes
 *
 * Handles session lifecycle operations: initialization, observations, summarization, completion.
 * These routes manage the flow of work through the Claude Agent SDK.
 */

import express, { Request, Response } from 'express';
import { getWorkerPort } from '../../../../shared/worker-utils.js';
import { logger } from '../../../../utils/logger.js';
import { stripMemoryTagsFromJson } from '../../../../utils/tag-stripping.js';
import { SessionManager } from '../../SessionManager.js';
import { DatabaseManager } from '../../DatabaseManager.js';
import { SDKAgent } from '../../SDKAgent.js';
import { SSEBroadcaster } from '../../SSEBroadcaster.js';
import type { WorkerService } from '../../../worker-service.js';

export class SessionRoutes {
  constructor(
    private sessionManager: SessionManager,
    private dbManager: DatabaseManager,
    private sdkAgent: SDKAgent,
    private sseBroadcaster: SSEBroadcaster,
    private workerService: WorkerService
  ) {}

  setupRoutes(app: express.Application): void {
    // Legacy session endpoints (use sessionDbId)
    app.post('/sessions/:sessionDbId/init', this.handleSessionInit.bind(this));
    app.post('/sessions/:sessionDbId/observations', this.handleObservations.bind(this));
    app.post('/sessions/:sessionDbId/summarize', this.handleSummarize.bind(this));
    app.get('/sessions/:sessionDbId/status', this.handleSessionStatus.bind(this));
    app.delete('/sessions/:sessionDbId', this.handleSessionDelete.bind(this));
    app.post('/sessions/:sessionDbId/complete', this.handleSessionComplete.bind(this));

    // New session endpoints (use claudeSessionId)
    app.post('/api/sessions/observations', this.handleObservationsByClaudeId.bind(this));
    app.post('/api/sessions/summarize', this.handleSummarizeByClaudeId.bind(this));
    app.post('/api/sessions/complete', this.handleSessionCompleteByClaudeId.bind(this));
  }

  /**
   * Initialize a new session
   */
  private handleSessionInit(req: Request, res: Response): void {
    try {
      const sessionDbId = parseInt(req.params.sessionDbId, 10);
      const { userPrompt, promptNumber } = req.body;
      const session = this.sessionManager.initializeSession(sessionDbId, userPrompt, promptNumber);

      // Get the latest user_prompt for this session to sync to Chroma
      const db = this.dbManager.getSessionStore().db;
      const latestPrompt = db.prepare(`
        SELECT
          up.*,
          s.sdk_session_id,
          s.project
        FROM user_prompts up
        JOIN sdk_sessions s ON up.claude_session_id = s.claude_session_id
        WHERE up.claude_session_id = ?
        ORDER BY up.created_at_epoch DESC
        LIMIT 1
      `).get(session.claudeSessionId) as any;

      // Broadcast new prompt to SSE clients (for web UI)
      if (latestPrompt) {
        this.sseBroadcaster.broadcast({
          type: 'new_prompt',
          prompt: {
            id: latestPrompt.id,
            claude_session_id: latestPrompt.claude_session_id,
            project: latestPrompt.project,
            prompt_number: latestPrompt.prompt_number,
            prompt_text: latestPrompt.prompt_text,
            created_at_epoch: latestPrompt.created_at_epoch
          }
        });

        // Start activity indicator immediately when prompt arrives (work is about to begin)
        this.sseBroadcaster.broadcast({
          type: 'processing_status',
          isProcessing: true
        });

        // Sync user prompt to Chroma with error logging
        const chromaStart = Date.now();
        const promptText = latestPrompt.prompt_text;
        this.dbManager.getChromaSync().syncUserPrompt(
          latestPrompt.id,
          latestPrompt.sdk_session_id,
          latestPrompt.project,
          promptText,
          latestPrompt.prompt_number,
          latestPrompt.created_at_epoch
        ).then(() => {
          const chromaDuration = Date.now() - chromaStart;
          const truncatedPrompt = promptText.length > 60
            ? promptText.substring(0, 60) + '...'
            : promptText;
          logger.debug('CHROMA', 'User prompt synced', {
            promptId: latestPrompt.id,
            duration: `${chromaDuration}ms`,
            prompt: truncatedPrompt
          });
        }).catch(err => {
          logger.error('CHROMA', 'Failed to sync user_prompt', {
            promptId: latestPrompt.id,
            sessionId: sessionDbId
          }, err);
        });
      }

      // Broadcast processing status (based on queue depth)
      this.workerService.broadcastProcessingStatus();

      // Start SDK agent in background (pass worker ref for spinner control)
      logger.info('SESSION', 'Generator starting', {
        sessionId: sessionDbId,
        project: session.project,
        promptNum: session.lastPromptNumber
      });

      session.generatorPromise = this.sdkAgent.startSession(session, this.workerService)
        .catch(err => {
          logger.failure('SDK', 'SDK agent error', { sessionId: sessionDbId }, err);
        })
        .finally(() => {
          // Clear generator reference when completed
          logger.info('SESSION', 'Generator finished', { sessionId: sessionDbId });
          session.generatorPromise = null;
          // Broadcast status change (generator finished, may stop spinner)
          this.workerService.broadcastProcessingStatus();
        });

      // Broadcast SSE event
      this.sseBroadcaster.broadcast({
        type: 'session_started',
        sessionDbId,
        project: session.project
      });

      res.json({ status: 'initialized', sessionDbId, port: getWorkerPort() });
    } catch (error) {
      logger.failure('WORKER', 'Session init failed', {}, error as Error);
      res.status(500).json({ error: (error as Error).message });
    }
  }

  /**
   * Queue observations for processing
   * CRITICAL: Ensures SDK agent is running to process the queue (ALWAYS SAVE EVERYTHING)
   */
  private handleObservations(req: Request, res: Response): void {
    try {
      const sessionDbId = parseInt(req.params.sessionDbId, 10);
      const { tool_name, tool_input, tool_response, prompt_number, cwd } = req.body;

      this.sessionManager.queueObservation(sessionDbId, {
        tool_name,
        tool_input,
        tool_response,
        prompt_number,
        cwd
      });

      // CRITICAL: Ensure SDK agent is running to consume the queue
      const session = this.sessionManager.getSession(sessionDbId);
      if (session && !session.generatorPromise) {
        logger.info('SESSION', 'Generator auto-starting (observation)', {
          sessionId: sessionDbId,
          queueDepth: session.pendingMessages.length
        });

        session.generatorPromise = this.sdkAgent.startSession(session, this.workerService)
          .catch(err => {
            logger.failure('SDK', 'SDK agent error', { sessionId: sessionDbId }, err);
          })
          .finally(() => {
            // Clear generator reference when completed
            logger.info('SESSION', 'Generator finished', { sessionId: sessionDbId });
            session.generatorPromise = null;
            // Broadcast status change (generator finished, may stop spinner)
            this.workerService.broadcastProcessingStatus();
          });
      }

      // Broadcast activity status (queue depth changed)
      this.workerService.broadcastProcessingStatus();

      // Broadcast SSE event
      this.sseBroadcaster.broadcast({
        type: 'observation_queued',
        sessionDbId
      });

      res.json({ status: 'queued' });
    } catch (error) {
      logger.failure('WORKER', 'Observation queuing failed', {}, error as Error);
      res.status(500).json({ error: (error as Error).message });
    }
  }

  /**
   * Queue summarize request
   * CRITICAL: Ensures SDK agent is running to process the queue (ALWAYS SAVE EVERYTHING)
   */
  private handleSummarize(req: Request, res: Response): void {
    try {
      const sessionDbId = parseInt(req.params.sessionDbId, 10);
      const { last_user_message, last_assistant_message } = req.body;

      this.sessionManager.queueSummarize(sessionDbId, last_user_message, last_assistant_message);

      // CRITICAL: Ensure SDK agent is running to consume the queue
      const session = this.sessionManager.getSession(sessionDbId);
      if (session && !session.generatorPromise) {
        logger.info('SESSION', 'Generator auto-starting (summarize)', {
          sessionId: sessionDbId,
          queueDepth: session.pendingMessages.length
        });

        session.generatorPromise = this.sdkAgent.startSession(session, this.workerService)
          .catch(err => {
            logger.failure('SDK', 'SDK agent error', { sessionId: sessionDbId }, err);
          })
          .finally(() => {
            // Clear generator reference when completed
            logger.info('SESSION', 'Generator finished', { sessionId: sessionDbId });
            session.generatorPromise = null;
            // Broadcast status change (generator finished, may stop spinner)
            this.workerService.broadcastProcessingStatus();
          });
      }

      // Broadcast activity status (queue depth changed)
      this.workerService.broadcastProcessingStatus();

      res.json({ status: 'queued' });
    } catch (error) {
      logger.failure('WORKER', 'Summarize queuing failed', {}, error as Error);
      res.status(500).json({ error: (error as Error).message });
    }
  }

  /**
   * Get session status
   */
  private handleSessionStatus(req: Request, res: Response): void {
    try {
      const sessionDbId = parseInt(req.params.sessionDbId, 10);
      const session = this.sessionManager.getSession(sessionDbId);

      if (!session) {
        res.json({ status: 'not_found' });
        return;
      }

      res.json({
        status: 'active',
        sessionDbId,
        project: session.project,
        queueLength: session.pendingMessages.length,
        uptime: Date.now() - session.startTime
      });
    } catch (error) {
      logger.failure('WORKER', 'Session status failed', {}, error as Error);
      res.status(500).json({ error: (error as Error).message });
    }
  }

  /**
   * Delete a session
   */
  private async handleSessionDelete(req: Request, res: Response): Promise<void> {
    try {
      const sessionDbId = parseInt(req.params.sessionDbId, 10);
      await this.sessionManager.deleteSession(sessionDbId);

      // Mark session complete in database
      this.dbManager.markSessionComplete(sessionDbId);

      // Broadcast SSE event
      this.sseBroadcaster.broadcast({
        type: 'session_completed',
        sessionDbId
      });

      res.json({ status: 'deleted' });
    } catch (error) {
      logger.failure('WORKER', 'Session delete failed', {}, error as Error);
      res.status(500).json({ error: (error as Error).message });
    }
  }

  /**
   * Complete a session (backward compatibility for cleanup-hook)
   * cleanup-hook expects POST /sessions/:sessionDbId/complete instead of DELETE
   */
  private async handleSessionComplete(req: Request, res: Response): Promise<void> {
    try {
      const sessionDbId = parseInt(req.params.sessionDbId, 10);
      if (isNaN(sessionDbId)) {
        res.status(400).json({ success: false, error: 'Invalid session ID' });
        return;
      }

      await this.sessionManager.deleteSession(sessionDbId);

      // Mark session complete in database
      this.dbManager.markSessionComplete(sessionDbId);

      // Broadcast processing status (based on queue depth)
      this.workerService.broadcastProcessingStatus();

      // Broadcast SSE event
      this.sseBroadcaster.broadcast({
        type: 'session_completed',
        timestamp: Date.now(),
        sessionDbId
      });

      res.json({ success: true });
    } catch (error) {
      logger.failure('WORKER', 'Session complete failed', {}, error as Error);
      res.status(500).json({ success: false, error: String(error) });
    }
  }

  /**
   * Queue observations by claudeSessionId (post-tool-use-hook uses this)
   * POST /api/sessions/observations
   * Body: { claudeSessionId, tool_name, tool_input, tool_response, cwd }
   */
  private handleObservationsByClaudeId(req: Request, res: Response): void {
    try {
      const { claudeSessionId, tool_name, tool_input, tool_response, cwd } = req.body;

      if (!claudeSessionId) {
        res.status(400).json({ error: 'Missing claudeSessionId' });
        return;
      }

      const store = this.dbManager.getSessionStore();

      // Get or create session
      const sessionDbId = store.createSDKSession(claudeSessionId, '', '');
      const promptNumber = store.getPromptCounter(sessionDbId);

      // Privacy check: skip if user prompt was entirely private
      const userPrompt = store.getUserPrompt(claudeSessionId, promptNumber);
      if (!userPrompt || userPrompt.trim() === '') {
        logger.debug('HOOK', 'Skipping observation - user prompt was entirely private', {
          sessionId: sessionDbId,
          promptNumber,
          tool_name
        });
        res.json({ status: 'skipped', reason: 'private' });
        return;
      }

      // Strip memory tags from tool_input and tool_response
      let cleanedToolInput = '{}';
      let cleanedToolResponse = '{}';

      try {
        cleanedToolInput = tool_input !== undefined
          ? stripMemoryTagsFromJson(JSON.stringify(tool_input))
          : '{}';
      } catch (error) {
        cleanedToolInput = '{"error": "Failed to serialize tool_input"}';
      }

      try {
        cleanedToolResponse = tool_response !== undefined
          ? stripMemoryTagsFromJson(JSON.stringify(tool_response))
          : '{}';
      } catch (error) {
        cleanedToolResponse = '{"error": "Failed to serialize tool_response"}';
      }

      // Queue observation
      this.sessionManager.queueObservation(sessionDbId, {
        tool_name,
        tool_input: cleanedToolInput,
        tool_response: cleanedToolResponse,
        prompt_number: promptNumber,
        cwd: cwd || ''
      });

      // Ensure SDK agent is running
      const session = this.sessionManager.getSession(sessionDbId);
      if (session && !session.generatorPromise) {
        logger.info('SESSION', 'Generator auto-starting (observation)', {
          sessionId: sessionDbId,
          queueDepth: session.pendingMessages.length
        });

        session.generatorPromise = this.sdkAgent.startSession(session, this.workerService)
          .catch(err => {
            logger.failure('SDK', 'SDK agent error', { sessionId: sessionDbId }, err);
          })
          .finally(() => {
            logger.info('SESSION', 'Generator finished', { sessionId: sessionDbId });
            session.generatorPromise = null;
            this.workerService.broadcastProcessingStatus();
          });
      }

      // Broadcast activity status
      this.workerService.broadcastProcessingStatus();

      // Broadcast SSE event
      this.sseBroadcaster.broadcast({
        type: 'observation_queued',
        sessionDbId
      });

      res.json({ status: 'queued' });
    } catch (error) {
      logger.failure('WORKER', 'Observation by claudeId failed', {}, error as Error);
      res.status(500).json({ error: (error as Error).message });
    }
  }

  /**
   * Queue summarize by claudeSessionId (summary-hook uses this)
   * POST /api/sessions/summarize
   * Body: { claudeSessionId, last_user_message, last_assistant_message }
   *
   * Checks privacy, queues summarize request for SDK agent
   */
  private handleSummarizeByClaudeId(req: Request, res: Response): void {
    try {
      const { claudeSessionId, last_user_message, last_assistant_message } = req.body;

      if (!claudeSessionId) {
        res.status(400).json({ error: 'Missing claudeSessionId' });
        return;
      }

      const store = this.dbManager.getSessionStore();

      // Get or create session
      const sessionDbId = store.createSDKSession(claudeSessionId, '', '');
      const promptNumber = store.getPromptCounter(sessionDbId);

      // Privacy check: skip if user prompt was entirely private
      const userPrompt = store.getUserPrompt(claudeSessionId, promptNumber);
      if (!userPrompt || userPrompt.trim() === '') {
        logger.debug('HOOK', 'Skipping summary - user prompt was entirely private', {
          sessionId: sessionDbId,
          promptNumber
        });
        res.json({ status: 'skipped', reason: 'private' });
        return;
      }

      // Queue summarize
      this.sessionManager.queueSummarize(sessionDbId, last_user_message || '', last_assistant_message);

      // Ensure SDK agent is running
      const session = this.sessionManager.getSession(sessionDbId);
      if (session && !session.generatorPromise) {
        logger.info('SESSION', 'Generator auto-starting (summarize)', {
          sessionId: sessionDbId,
          queueDepth: session.pendingMessages.length
        });

        session.generatorPromise = this.sdkAgent.startSession(session, this.workerService)
          .catch(err => {
            logger.failure('SDK', 'SDK agent error', { sessionId: sessionDbId }, err);
          })
          .finally(() => {
            logger.info('SESSION', 'Generator finished', { sessionId: sessionDbId });
            session.generatorPromise = null;
            this.workerService.broadcastProcessingStatus();
          });
      }

      // Broadcast activity status
      this.workerService.broadcastProcessingStatus();

      res.json({ status: 'queued' });
    } catch (error) {
      logger.failure('WORKER', 'Summarize by claudeId failed', {}, error as Error);
      res.status(500).json({ error: (error as Error).message });
    }
  }

  /**
   * Complete session by claudeSessionId (cleanup-hook uses this)
   * POST /api/sessions/complete
   * Body: { claudeSessionId }
   *
   * Marks session complete, stops SDK agent, broadcasts status
   */
  private async handleSessionCompleteByClaudeId(req: Request, res: Response): Promise<void> {
    try {
      const { claudeSessionId } = req.body;

      if (!claudeSessionId) {
        res.status(400).json({ success: false, error: 'Missing claudeSessionId' });
        return;
      }

      const store = this.dbManager.getSessionStore();

      // Find session by claudeSessionId
      const session = store.findActiveSDKSession(claudeSessionId);
      if (!session) {
        // No active session - nothing to clean up (may have already been completed)
        res.json({ success: true, message: 'No active session found' });
        return;
      }

      const sessionDbId = session.id;

      // Delete from session manager (aborts SDK agent)
      await this.sessionManager.deleteSession(sessionDbId);

      // Mark session complete in database
      this.dbManager.markSessionComplete(sessionDbId);

      // Broadcast processing status
      this.workerService.broadcastProcessingStatus();

      // Broadcast SSE event
      this.sseBroadcaster.broadcast({
        type: 'session_completed',
        timestamp: Date.now(),
        sessionDbId
      });

      res.json({ success: true });
    } catch (error) {
      logger.failure('WORKER', 'Session complete by claudeId failed', {}, error as Error);
      res.status(500).json({ success: false, error: String(error) });
    }
  }
}
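A recurring invariant in the handlers above is the generator lifecycle: start the SDK agent, log (but never rethrow) errors in `.catch`, and clear `generatorPromise` in `.finally` so the next queued message can auto-start a fresh generator. A reduced, self-contained sketch of just that invariant — the `Session` shape and `startAgent` are illustrative stand-ins, not the real `SessionManager` types:

```typescript
// Minimal session shape: only the field the lifecycle pattern touches.
type Session = { generatorPromise: Promise<void> | null };

// Stand-in for sdkAgent.startSession: resolves or rejects on demand.
async function startAgent(shouldFail: boolean): Promise<void> {
  if (shouldFail) throw new Error('agent crashed');
}

async function runOnce(session: Session, shouldFail: boolean): Promise<void> {
  session.generatorPromise = startAgent(shouldFail)
    .catch(() => { /* logged in the real code; never rethrown, so no unhandled rejection */ })
    .finally(() => { session.generatorPromise = null; });
  await session.generatorPromise;
}

async function main() {
  const session: Session = { generatorPromise: null };
  await runOnce(session, false);
  console.log(session.generatorPromise === null); // true: cleared on success
  await runOnce(session, true);
  console.log(session.generatorPromise === null); // true: cleared on failure too
}
main();
```

Because `.finally` runs on both settlement paths, the `if (session && !session.generatorPromise)` auto-start checks in the handlers stay correct even after an agent crash.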
@@ -0,0 +1,404 @@
/**
 * Settings Routes
 *
 * Handles settings management, MCP toggle, and branch switching.
 * Settings are stored in ~/.claude-mem/settings.json
 */

import express, { Request, Response } from 'express';
import path from 'path';
import { readFileSync, writeFileSync, existsSync, renameSync } from 'fs';
import { homedir } from 'os';
import { getPackageRoot } from '../../../../shared/paths.js';
import { logger } from '../../../../utils/logger.js';
import { SettingsManager } from '../../SettingsManager.js';
import { getBranchInfo, switchBranch, pullUpdates } from '../../BranchManager.js';
import {
  OBSERVATION_TYPES,
  OBSERVATION_CONCEPTS,
  DEFAULT_OBSERVATION_TYPES_STRING,
  DEFAULT_OBSERVATION_CONCEPTS_STRING
} from '../../../../constants/observation-metadata.js';

export class SettingsRoutes {
  constructor(
    private settingsManager: SettingsManager
  ) {}

  setupRoutes(app: express.Application): void {
    // Settings endpoints
    app.get('/api/settings', this.handleGetSettings.bind(this));
    app.post('/api/settings', this.handleUpdateSettings.bind(this));

    // MCP toggle endpoints
    app.get('/api/mcp/status', this.handleGetMcpStatus.bind(this));
    app.post('/api/mcp/toggle', this.handleToggleMcp.bind(this));

    // Branch switching endpoints
    app.get('/api/branch/status', this.handleGetBranchStatus.bind(this));
    app.post('/api/branch/switch', this.handleSwitchBranch.bind(this));
    app.post('/api/branch/update', this.handleUpdateBranch.bind(this));
  }

  /**
   * Get environment settings (from ~/.claude-mem/settings.json)
   */
  private handleGetSettings(req: Request, res: Response): void {
    try {
      const settingsPath = path.join(homedir(), '.claude-mem', 'settings.json');

      if (!existsSync(settingsPath)) {
        // Return defaults if file doesn't exist
        res.json({
          CLAUDE_MEM_MODEL: 'claude-haiku-4-5',
          CLAUDE_MEM_CONTEXT_OBSERVATIONS: '50',
          CLAUDE_MEM_WORKER_PORT: '37777',
          // Token Economics
          CLAUDE_MEM_CONTEXT_SHOW_READ_TOKENS: 'true',
          CLAUDE_MEM_CONTEXT_SHOW_WORK_TOKENS: 'true',
          CLAUDE_MEM_CONTEXT_SHOW_SAVINGS_AMOUNT: 'true',
          CLAUDE_MEM_CONTEXT_SHOW_SAVINGS_PERCENT: 'true',
          // Observation Filtering
          CLAUDE_MEM_CONTEXT_OBSERVATION_TYPES: DEFAULT_OBSERVATION_TYPES_STRING,
          CLAUDE_MEM_CONTEXT_OBSERVATION_CONCEPTS: DEFAULT_OBSERVATION_CONCEPTS_STRING,
          // Display Configuration
          CLAUDE_MEM_CONTEXT_FULL_COUNT: '5',
          CLAUDE_MEM_CONTEXT_FULL_FIELD: 'narrative',
          CLAUDE_MEM_CONTEXT_SESSION_COUNT: '10',
          // Feature Toggles
          CLAUDE_MEM_CONTEXT_SHOW_LAST_SUMMARY: 'true',
          CLAUDE_MEM_CONTEXT_SHOW_LAST_MESSAGE: 'false',
        });
        return;
      }

      const settingsData = readFileSync(settingsPath, 'utf-8');
      const settings = JSON.parse(settingsData);
      const env = settings.env || {};

      res.json({
        CLAUDE_MEM_MODEL: env.CLAUDE_MEM_MODEL || 'claude-haiku-4-5',
        CLAUDE_MEM_CONTEXT_OBSERVATIONS: env.CLAUDE_MEM_CONTEXT_OBSERVATIONS || '50',
        CLAUDE_MEM_WORKER_PORT: env.CLAUDE_MEM_WORKER_PORT || '37777',
        // Token Economics
        CLAUDE_MEM_CONTEXT_SHOW_READ_TOKENS: env.CLAUDE_MEM_CONTEXT_SHOW_READ_TOKENS || 'true',
        CLAUDE_MEM_CONTEXT_SHOW_WORK_TOKENS: env.CLAUDE_MEM_CONTEXT_SHOW_WORK_TOKENS || 'true',
        CLAUDE_MEM_CONTEXT_SHOW_SAVINGS_AMOUNT: env.CLAUDE_MEM_CONTEXT_SHOW_SAVINGS_AMOUNT || 'true',
        CLAUDE_MEM_CONTEXT_SHOW_SAVINGS_PERCENT: env.CLAUDE_MEM_CONTEXT_SHOW_SAVINGS_PERCENT || 'true',
        // Observation Filtering
        CLAUDE_MEM_CONTEXT_OBSERVATION_TYPES: env.CLAUDE_MEM_CONTEXT_OBSERVATION_TYPES || DEFAULT_OBSERVATION_TYPES_STRING,
|
||||
CLAUDE_MEM_CONTEXT_OBSERVATION_CONCEPTS: env.CLAUDE_MEM_CONTEXT_OBSERVATION_CONCEPTS || DEFAULT_OBSERVATION_CONCEPTS_STRING,
|
||||
// Display Configuration
|
||||
CLAUDE_MEM_CONTEXT_FULL_COUNT: env.CLAUDE_MEM_CONTEXT_FULL_COUNT || '5',
|
||||
CLAUDE_MEM_CONTEXT_FULL_FIELD: env.CLAUDE_MEM_CONTEXT_FULL_FIELD || 'narrative',
|
||||
CLAUDE_MEM_CONTEXT_SESSION_COUNT: env.CLAUDE_MEM_CONTEXT_SESSION_COUNT || '10',
|
||||
// Feature Toggles
|
||||
CLAUDE_MEM_CONTEXT_SHOW_LAST_SUMMARY: env.CLAUDE_MEM_CONTEXT_SHOW_LAST_SUMMARY || 'true',
|
||||
CLAUDE_MEM_CONTEXT_SHOW_LAST_MESSAGE: env.CLAUDE_MEM_CONTEXT_SHOW_LAST_MESSAGE || 'false',
|
||||
});
|
||||
} catch (error) {
|
||||
logger.failure('WORKER', 'Get settings failed', {}, error as Error);
|
||||
res.status(500).json({ error: (error as Error).message });
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Update environment settings (in ~/.claude/settings.json) with validation
|
||||
*/
|
||||
private handleUpdateSettings(req: Request, res: Response): void {
|
||||
try {
|
||||
// Validate CLAUDE_MEM_CONTEXT_OBSERVATIONS
|
||||
if (req.body.CLAUDE_MEM_CONTEXT_OBSERVATIONS) {
|
||||
const obsCount = parseInt(req.body.CLAUDE_MEM_CONTEXT_OBSERVATIONS, 10);
|
||||
if (isNaN(obsCount) || obsCount < 1 || obsCount > 200) {
|
||||
res.status(400).json({
|
||||
success: false,
|
||||
error: 'CLAUDE_MEM_CONTEXT_OBSERVATIONS must be between 1 and 200'
|
||||
});
|
||||
return;
|
||||
}
|
||||
}
|
||||
|
||||
// Validate CLAUDE_MEM_WORKER_PORT
|
||||
if (req.body.CLAUDE_MEM_WORKER_PORT) {
|
||||
const port = parseInt(req.body.CLAUDE_MEM_WORKER_PORT, 10);
|
||||
if (isNaN(port) || port < 1024 || port > 65535) {
|
||||
res.status(400).json({
|
||||
success: false,
|
||||
error: 'CLAUDE_MEM_WORKER_PORT must be between 1024 and 65535'
|
||||
});
|
||||
return;
|
||||
}
|
||||
}
|
||||
|
||||
// Validate context settings
|
||||
const validation = this.validateContextSettings(req.body);
|
||||
if (!validation.valid) {
|
||||
res.status(400).json({
|
||||
success: false,
|
||||
error: validation.error
|
||||
});
|
||||
return;
|
||||
}
|
||||
|
||||
// Read existing settings
|
||||
const settingsPath = path.join(homedir(), '.claude-mem', 'settings.json');
|
||||
let settings: any = { env: {} };
|
||||
|
||||
if (existsSync(settingsPath)) {
|
||||
const settingsData = readFileSync(settingsPath, 'utf-8');
|
||||
settings = JSON.parse(settingsData);
|
||||
if (!settings.env) {
|
||||
settings.env = {};
|
||||
}
|
||||
}
|
||||
|
||||
// Update all settings from request body
|
||||
const settingKeys = [
|
||||
'CLAUDE_MEM_MODEL',
|
||||
'CLAUDE_MEM_CONTEXT_OBSERVATIONS',
|
||||
'CLAUDE_MEM_WORKER_PORT',
|
||||
'CLAUDE_MEM_CONTEXT_SHOW_READ_TOKENS',
|
||||
'CLAUDE_MEM_CONTEXT_SHOW_WORK_TOKENS',
|
||||
'CLAUDE_MEM_CONTEXT_SHOW_SAVINGS_AMOUNT',
|
||||
'CLAUDE_MEM_CONTEXT_SHOW_SAVINGS_PERCENT',
|
||||
'CLAUDE_MEM_CONTEXT_OBSERVATION_TYPES',
|
||||
'CLAUDE_MEM_CONTEXT_OBSERVATION_CONCEPTS',
|
||||
'CLAUDE_MEM_CONTEXT_FULL_COUNT',
|
||||
'CLAUDE_MEM_CONTEXT_FULL_FIELD',
|
||||
'CLAUDE_MEM_CONTEXT_SESSION_COUNT',
|
||||
'CLAUDE_MEM_CONTEXT_SHOW_LAST_SUMMARY',
|
||||
'CLAUDE_MEM_CONTEXT_SHOW_LAST_MESSAGE',
|
||||
];
|
||||
|
||||
for (const key of settingKeys) {
|
||||
if (req.body[key] !== undefined) {
|
||||
settings.env[key] = req.body[key];
|
||||
}
|
||||
}
|
||||
|
||||
// Write back
|
||||
writeFileSync(settingsPath, JSON.stringify(settings, null, 2), 'utf-8');
|
||||
|
||||
logger.info('WORKER', 'Settings updated');
|
||||
res.json({ success: true, message: 'Settings updated successfully' });
|
||||
} catch (error) {
|
||||
logger.failure('WORKER', 'Update settings failed', {}, error as Error);
|
||||
res.status(500).json({ success: false, error: String(error) });
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* GET /api/mcp/status - Check if MCP search server is enabled
|
||||
*/
|
||||
private handleGetMcpStatus(req: Request, res: Response): void {
|
||||
try {
|
||||
const enabled = this.isMcpEnabled();
|
||||
res.json({ enabled });
|
||||
} catch (error) {
|
||||
logger.failure('WORKER', 'Get MCP status failed', {}, error as Error);
|
||||
res.status(500).json({ error: (error as Error).message });
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* POST /api/mcp/toggle - Toggle MCP search server on/off
|
||||
* Body: { enabled: boolean }
|
||||
*/
|
||||
private handleToggleMcp(req: Request, res: Response): void {
|
||||
try {
|
||||
const { enabled } = req.body;
|
||||
|
||||
if (typeof enabled !== 'boolean') {
|
||||
res.status(400).json({ error: 'enabled must be a boolean' });
|
||||
return;
|
||||
}
|
||||
|
||||
this.toggleMcp(enabled);
|
||||
res.json({ success: true, enabled: this.isMcpEnabled() });
|
||||
} catch (error) {
|
||||
logger.failure('WORKER', 'Toggle MCP failed', {}, error as Error);
|
||||
res.status(500).json({ success: false, error: (error as Error).message });
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* GET /api/branch/status - Get current branch information
|
||||
*/
|
||||
private handleGetBranchStatus(req: Request, res: Response): void {
|
||||
try {
|
||||
const info = getBranchInfo();
|
||||
res.json(info);
|
||||
} catch (error) {
|
||||
logger.failure('WORKER', 'Failed to get branch status', {}, error as Error);
|
||||
res.status(500).json({ error: (error as Error).message });
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* POST /api/branch/switch - Switch to a different branch
|
||||
* Body: { branch: "main" | "beta/7.0" }
|
||||
*/
|
||||
private async handleSwitchBranch(req: Request, res: Response): Promise<void> {
|
||||
try {
|
||||
const { branch } = req.body;
|
||||
|
||||
if (!branch) {
|
||||
res.status(400).json({ success: false, error: 'Missing branch parameter' });
|
||||
return;
|
||||
}
|
||||
|
||||
// Validate branch name
|
||||
const allowedBranches = ['main', 'beta/7.0'];
|
||||
if (!allowedBranches.includes(branch)) {
|
||||
res.status(400).json({
|
||||
success: false,
|
||||
error: `Invalid branch. Allowed: ${allowedBranches.join(', ')}`
|
||||
});
|
||||
return;
|
||||
}
|
||||
|
||||
logger.info('WORKER', 'Branch switch requested', { branch });
|
||||
|
||||
const result = await switchBranch(branch);
|
||||
|
||||
if (result.success) {
|
||||
// Schedule worker restart after response is sent
|
||||
setTimeout(() => {
|
||||
logger.info('WORKER', 'Restarting worker after branch switch');
|
||||
process.exit(0); // PM2 will restart the worker
|
||||
}, 1000);
|
||||
}
|
||||
|
||||
res.json(result);
|
||||
} catch (error) {
|
||||
logger.failure('WORKER', 'Branch switch failed', {}, error as Error);
|
||||
res.status(500).json({ success: false, error: (error as Error).message });
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* POST /api/branch/update - Pull latest updates for current branch
|
||||
*/
|
||||
private async handleUpdateBranch(req: Request, res: Response): Promise<void> {
|
||||
try {
|
||||
logger.info('WORKER', 'Branch update requested');
|
||||
|
||||
const result = await pullUpdates();
|
||||
|
||||
if (result.success) {
|
||||
// Schedule worker restart after response is sent
|
||||
setTimeout(() => {
|
||||
logger.info('WORKER', 'Restarting worker after branch update');
|
||||
process.exit(0); // PM2 will restart the worker
|
||||
}, 1000);
|
||||
}
|
||||
|
||||
res.json(result);
|
||||
} catch (error) {
|
||||
logger.failure('WORKER', 'Branch update failed', {}, error as Error);
|
||||
res.status(500).json({ success: false, error: (error as Error).message });
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Validate context settings from request body
|
||||
*/
|
||||
private validateContextSettings(settings: any): { valid: boolean; error?: string } {
|
||||
// Validate boolean string values
|
||||
const booleanSettings = [
|
||||
'CLAUDE_MEM_CONTEXT_SHOW_READ_TOKENS',
|
||||
'CLAUDE_MEM_CONTEXT_SHOW_WORK_TOKENS',
|
||||
'CLAUDE_MEM_CONTEXT_SHOW_SAVINGS_AMOUNT',
|
||||
'CLAUDE_MEM_CONTEXT_SHOW_SAVINGS_PERCENT',
|
||||
'CLAUDE_MEM_CONTEXT_SHOW_LAST_SUMMARY',
|
||||
'CLAUDE_MEM_CONTEXT_SHOW_LAST_MESSAGE',
|
||||
];
|
||||
|
||||
for (const key of booleanSettings) {
|
||||
if (settings[key] && !['true', 'false'].includes(settings[key])) {
|
||||
return { valid: false, error: `${key} must be "true" or "false"` };
|
||||
}
|
||||
}
|
||||
|
||||
// Validate FULL_COUNT (0-20)
|
||||
if (settings.CLAUDE_MEM_CONTEXT_FULL_COUNT) {
|
||||
const count = parseInt(settings.CLAUDE_MEM_CONTEXT_FULL_COUNT, 10);
|
||||
if (isNaN(count) || count < 0 || count > 20) {
|
||||
return { valid: false, error: 'CLAUDE_MEM_CONTEXT_FULL_COUNT must be between 0 and 20' };
|
||||
}
|
||||
}
|
||||
|
||||
// Validate SESSION_COUNT (1-50)
|
||||
if (settings.CLAUDE_MEM_CONTEXT_SESSION_COUNT) {
|
||||
const count = parseInt(settings.CLAUDE_MEM_CONTEXT_SESSION_COUNT, 10);
|
||||
if (isNaN(count) || count < 1 || count > 50) {
|
||||
return { valid: false, error: 'CLAUDE_MEM_CONTEXT_SESSION_COUNT must be between 1 and 50' };
|
||||
}
|
||||
}
|
||||
|
||||
// Validate FULL_FIELD
|
||||
if (settings.CLAUDE_MEM_CONTEXT_FULL_FIELD) {
|
||||
if (!['narrative', 'facts'].includes(settings.CLAUDE_MEM_CONTEXT_FULL_FIELD)) {
|
||||
return { valid: false, error: 'CLAUDE_MEM_CONTEXT_FULL_FIELD must be "narrative" or "facts"' };
|
||||
}
|
||||
}
|
||||
|
||||
// Validate observation types
|
||||
if (settings.CLAUDE_MEM_CONTEXT_OBSERVATION_TYPES) {
|
||||
const types = settings.CLAUDE_MEM_CONTEXT_OBSERVATION_TYPES.split(',').map((t: string) => t.trim());
|
||||
for (const type of types) {
|
||||
if (type && !OBSERVATION_TYPES.includes(type as any)) {
|
||||
return { valid: false, error: `Invalid observation type: ${type}. Valid types: ${OBSERVATION_TYPES.join(', ')}` };
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Validate observation concepts
|
||||
if (settings.CLAUDE_MEM_CONTEXT_OBSERVATION_CONCEPTS) {
|
||||
const concepts = settings.CLAUDE_MEM_CONTEXT_OBSERVATION_CONCEPTS.split(',').map((c: string) => c.trim());
|
||||
for (const concept of concepts) {
|
||||
if (concept && !OBSERVATION_CONCEPTS.includes(concept as any)) {
|
||||
return { valid: false, error: `Invalid observation concept: ${concept}. Valid concepts: ${OBSERVATION_CONCEPTS.join(', ')}` };
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
return { valid: true };
|
||||
}
|
||||
|
||||
/**
|
||||
* Check if MCP search server is enabled
|
||||
*/
|
||||
private isMcpEnabled(): boolean {
|
||||
const packageRoot = getPackageRoot();
|
||||
const mcpPath = path.join(packageRoot, 'plugin', '.mcp.json');
|
||||
return existsSync(mcpPath);
|
||||
}
|
||||
|
||||
/**
|
||||
* Toggle MCP search server (rename .mcp.json <-> .mcp.json.disabled)
|
||||
*/
|
||||
private toggleMcp(enabled: boolean): void {
|
||||
try {
|
||||
const packageRoot = getPackageRoot();
|
||||
const mcpPath = path.join(packageRoot, 'plugin', '.mcp.json');
|
||||
const mcpDisabledPath = path.join(packageRoot, 'plugin', '.mcp.json.disabled');
|
||||
|
||||
if (enabled && existsSync(mcpDisabledPath)) {
|
||||
// Enable: rename .mcp.json.disabled -> .mcp.json
|
||||
renameSync(mcpDisabledPath, mcpPath);
|
||||
logger.info('WORKER', 'MCP search server enabled');
|
||||
} else if (!enabled && existsSync(mcpPath)) {
|
||||
// Disable: rename .mcp.json -> .mcp.json.disabled
|
||||
renameSync(mcpPath, mcpDisabledPath);
|
||||
logger.info('WORKER', 'MCP search server disabled');
|
||||
} else {
|
||||
logger.debug('WORKER', 'MCP toggle no-op (already in desired state)', { enabled });
|
||||
}
|
||||
} catch (error) {
|
||||
logger.failure('WORKER', 'Failed to toggle MCP', { enabled }, error as Error);
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
}
|
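The settings validators above share one pattern: every value arrives as a string (they live under `env` in settings.json), booleans are the literal strings "true"/"false", and numbers are parsed and range-checked before anything is written back. A minimal standalone sketch of that pattern (illustrative names, not the real module):

```typescript
// Sketch of the validate-then-reject pattern used by validateContextSettings.
// Keys and ranges here are hypothetical stand-ins for the CLAUDE_MEM_* settings.

type ValidationResult = { valid: boolean; error?: string };

function validateSketch(settings: Record<string, string>): ValidationResult {
  // Booleans are stored as the strings "true"/"false", so validate the literal text
  const booleanKeys = ['SHOW_READ_TOKENS', 'SHOW_WORK_TOKENS'];
  for (const key of booleanKeys) {
    if (settings[key] && !['true', 'false'].includes(settings[key])) {
      return { valid: false, error: `${key} must be "true" or "false"` };
    }
  }

  // Numbers are parsed from strings; reject NaN and out-of-range values
  if (settings.FULL_COUNT) {
    const count = parseInt(settings.FULL_COUNT, 10);
    if (isNaN(count) || count < 0 || count > 20) {
      return { valid: false, error: 'FULL_COUNT must be between 0 and 20' };
    }
  }

  return { valid: true };
}

console.log(validateSketch({ SHOW_READ_TOKENS: 'yes' }).valid);  // false
console.log(validateSketch({ FULL_COUNT: '5' }).valid);          // true
```

Note the falsy-check quirk this pattern inherits: a key set to the empty string skips its validation entirely, which is harmless here because empty values are never written.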
@@ -0,0 +1,82 @@
/**
 * Viewer Routes
 *
 * Handles health check, viewer UI, and SSE stream endpoints.
 * These are used by the web viewer UI at http://localhost:37777
 */

import express, { Request, Response } from 'express';
import path from 'path';
import { readFileSync } from 'fs';
import { getPackageRoot } from '../../../../shared/paths.js';
import { logger } from '../../../../utils/logger.js';
import { SSEBroadcaster } from '../../SSEBroadcaster.js';
import { DatabaseManager } from '../../DatabaseManager.js';
import { SessionManager } from '../../SessionManager.js';

export class ViewerRoutes {
  constructor(
    private sseBroadcaster: SSEBroadcaster,
    private dbManager: DatabaseManager,
    private sessionManager: SessionManager
  ) {}

  setupRoutes(app: express.Application): void {
    app.get('/health', this.handleHealth.bind(this));
    app.get('/', this.handleViewerUI.bind(this));
    app.get('/stream', this.handleSSEStream.bind(this));
  }

  /**
   * Health check endpoint
   */
  private handleHealth(req: Request, res: Response): void {
    res.json({ status: 'ok', timestamp: Date.now() });
  }

  /**
   * Serve viewer UI
   */
  private handleViewerUI(req: Request, res: Response): void {
    try {
      const packageRoot = getPackageRoot();
      const viewerPath = path.join(packageRoot, 'plugin', 'ui', 'viewer.html');
      const html = readFileSync(viewerPath, 'utf-8');
      res.setHeader('Content-Type', 'text/html');
      res.send(html);
    } catch (error) {
      logger.failure('WORKER', 'Viewer UI error', {}, error as Error);
      res.status(500).json({ error: 'Failed to load viewer UI' });
    }
  }

  /**
   * SSE stream endpoint
   */
  private handleSSEStream(req: Request, res: Response): void {
    // Setup SSE headers
    res.setHeader('Content-Type', 'text/event-stream');
    res.setHeader('Cache-Control', 'no-cache');
    res.setHeader('Connection', 'keep-alive');

    // Add client to broadcaster
    this.sseBroadcaster.addClient(res);

    // Send initial_load event with projects list
    const allProjects = this.dbManager.getSessionStore().getAllProjects();
    this.sseBroadcaster.broadcast({
      type: 'initial_load',
      projects: allProjects,
      timestamp: Date.now()
    });

    // Send initial processing status (based on queue depth + active generators)
    const isProcessing = this.sessionManager.isAnySessionProcessing();
    const queueDepth = this.sessionManager.getTotalActiveWork(); // Includes queued + actively processing
    this.sseBroadcaster.broadcast({
      type: 'processing_status',
      isProcessing,
      queueDepth
    });
  }
}
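The `/stream` handler above relies on the SSEBroadcaster contract: addClient registers a response held open by the SSE headers, and broadcast pushes each event to every registered client. A minimal stand-in (hypothetical, not the real class) sketches that contract with an in-memory client instead of an HTTP response:

```typescript
// Hypothetical stand-in for the SSEBroadcaster interface assumed by ViewerRoutes.
// A client is anything with write(); broadcast() frames each event as an SSE
// "data:" line terminated by a blank line, as the EventSource protocol requires.

interface SSEClient {
  write(chunk: string): void;
}

class MiniBroadcaster {
  private clients: SSEClient[] = [];

  addClient(client: SSEClient): void {
    this.clients.push(client);
  }

  broadcast(event: Record<string, unknown>): void {
    // One SSE frame: "data: <json>\n\n"
    const frame = `data: ${JSON.stringify(event)}\n\n`;
    for (const client of this.clients) {
      client.write(frame);
    }
  }
}

// Usage: capture frames in memory instead of writing to a live response
const received: string[] = [];
const broadcaster = new MiniBroadcaster();
broadcaster.addClient({ write: (chunk) => received.push(chunk) });
broadcaster.broadcast({ type: 'processing_status', isProcessing: false, queueDepth: 0 });
console.log(received[0]);
```

One consequence visible in the real handler: because broadcast() writes to every client, the initial_load and processing_status events sent when a new viewer connects also reach viewers that were already connected.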