fix: comprehensive error handling improvements and architecture documentation (#522)

* Add enforceable anti-pattern detection for try-catch abuse

PROBLEM:
- Overly broad try-catch blocks have cost 10+ hours of debugging time
- Empty catch blocks silently swallow errors
- AI assistants use try-catch to paper over uncertainty instead of doing research

SOLUTION:
1. Created detect-error-handling-antipatterns.ts test
   - Detects empty catch blocks
   - Detects catch blocks without logging (45 CRITICAL found across both)
   - Detects large try blocks (>10 lines)
   - Detects generic catch without type checking
   - Detects catch-and-continue on critical paths
   - Exit code 1 if critical issues found
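
   A minimal sketch of how such a detector can flag empty catch blocks (this is
   an illustration, not the actual detect-error-handling-antipatterns.ts; a
   regex scan is a simplification of real AST-based detection):

```typescript
// Hypothetical sketch: flag catch blocks whose body is empty after
// stripping line comments. A real detector would walk the AST instead.
function findEmptyCatches(source: string): number[] {
  const flagged: number[] = [];
  const catchRe = /catch\s*(?:\([^)]*\))?\s*\{([^{}]*)\}/g;
  let match: RegExpExecArray | null;
  while ((match = catchRe.exec(source)) !== null) {
    const body = match[1].replace(/\/\/[^\n]*/g, '').trim();
    if (body === '') {
      // Report the 1-based line number where the catch starts
      flagged.push(source.slice(0, match.index).split('\n').length);
    }
  }
  return flagged;
}
```

   Wiring `process.exit(1)` to a non-empty result gives the CI enforcement the
   commit describes.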

2. Updated CLAUDE.md with MANDATORY ERROR HANDLING RULES
   - 5-question pre-flight checklist before any try-catch
   - FORBIDDEN patterns with examples
   - ALLOWED patterns with examples
   - Meta-rule: UNCERTAINTY TRIGGERS RESEARCH, NOT TRY-CATCH
   - Critical path protection list

3. Created comprehensive try-catch audit report
   - Documents all 96 try-catch blocks in worker service
   - Identifies critical issue at worker-service.ts:748-750
   - Categorizes patterns and provides recommendations

This is enforceable via a test, not just instructions that can be ignored.

Current state: 163 anti-patterns detected (45 critical, 47 high, 71 medium)
Next: Fix critical issues identified by test

🤖 Generated with Claude Code
Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* fix: add logging to 5 critical empty catch blocks (Wave 1)

Wave 1 of error handling cleanup - fixing empty catch blocks that
silently swallow errors without any trace.

Fixed files:
- src/bin/import-xml-observations.ts:80 - Log skipped invalid JSON
- src/utils/bun-path.ts:33 - Log when bun not in PATH
- src/utils/cursor-utils.ts:44 - Log failed registry reads
- src/utils/cursor-utils.ts:149 - Log corrupt MCP config
- src/shared/worker-utils.ts:128 - Log failed health checks

All catch blocks now have proper logging with context and error details.

Progress: 41 → 39 CRITICAL issues remaining
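
All five fixes follow one shape: catch, build a context object, log, then
continue with the fallback. A hedged sketch of that pattern (the logger
signature here is an assumption for illustration, not the codebase's API):

```typescript
// Sketch of the Wave 1 fix pattern: never swallow silently — log with
// context and extracted error details before falling back.
type LogFn = (scope: string, message: string, context: Record<string, unknown>) => void;

function parseLineOrSkip(line: string, lineNumber: number, log: LogFn): unknown | null {
  try {
    return JSON.parse(line);
  } catch (error) {
    log('IMPORT', 'Skipping invalid JSON line', {
      lineNumber,
      error: error instanceof Error ? error.message : String(error)
    });
    return null; // caller skips this line, but the failure is now visible
  }
}
```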

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* fix: add logging to promise catches on critical paths (Wave 2)

Wave 2 of error handling cleanup - fixing empty promise catch handlers
that silently swallow errors on critical code paths. These are the
patterns that caused the 10-hour debugging session.

Fixed empty promise catches:
- worker-service.ts:642 - Background initialization failures
- SDKAgent.ts:372,446 - Session processor errors
- GeminiAgent.ts:408,475 - Finalization failures
- OpenRouterAgent.ts:451,518 - Finalization failures
- SessionManager.ts:289 - Generator promise failures

Added justification comments to catch-and-continue blocks:
- worker-service.ts:68 - PID file removal (cleanup, non-critical)
- worker-service.ts:130 - Cursor context update (non-critical)

All promise rejection handlers now log errors with context, preventing
silent failures that were nearly impossible to debug.

Note: The anti-pattern detector only tracks try-catch blocks, not
standalone promise chains. These fixes address the root cause of the
original 10-hour debugging session even though the detector count
remains unchanged.
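
The root-cause pattern here was `promise.catch(() => {})`. A sketch of the
replacement shape (the helper name and logger signature are illustrative, not
from the codebase — the actual fixes inline the handler at each call site):

```typescript
// Fire-and-forget with visibility: the rejection is still swallowed so the
// caller is not disrupted, but it is logged first instead of vanishing.
function fireAndForget<T>(
  promise: Promise<T>,
  context: string,
  log: (message: string, error: unknown) => void
): Promise<T | undefined> {
  return promise.catch(error => {
    log(`${context} failed`, error);
    return undefined;
  });
}
```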

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* fix: add logging and documentation to error handling patterns (Wave 3)

Wave 3 of error handling cleanup - comprehensive review and fixes for
remaining critical issues identified by the anti-pattern detector.

Changes organized by severity:

**Wave 3.1: Fixed 2 EMPTY_CATCH blocks**
- worker-service.ts:162 - Health check polling now logs failures
- worker-service.ts:610 - Process cleanup logs failures

**Wave 3.2: Reviewed 12 CATCH_AND_CONTINUE patterns**
- Verified all are correct (log errors AND exit/return HTTP errors)
- Added justification comment to session recovery (line 829)
- All patterns properly notify callers of failures

**Wave 3.3: Fixed 29 NO_LOGGING_IN_CATCH issues**

Added logging to 16 catch blocks:
- UI layer: useSettings.ts, useContextPreview.ts (console logging)
- Servers: mcp-server.ts health checks and tool execution
- Worker: version fetch, cleanup, config corruption
- Routes: error handler, session recovery, settings validation
- Services: branch checkout, timeline queries

Documented 13 intentional exceptions with comments explaining why:
- Hot paths (port checks, process checks in tight loops)
- Error accumulation (transcript parser collects for batch retrieval)
- Special cases (logger can't log its own failures)
- Fallback parsing (JSON parse in optional data structures)

All changes follow error handling guidelines from CLAUDE.md:
- Appropriate log levels (error/warn/debug)
- Context objects with relevant details
- Descriptive messages explaining failures
- Error extraction pattern for Error instances
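
The error extraction pattern referenced above can be captured in one helper (a
sketch; the codebase inlines the expression at each catch site rather than
using a named helper):

```typescript
// Safely extract a message from an unknown catch value: TypeScript types
// catch variables as unknown, and not every thrown value is an Error.
function extractErrorMessage(error: unknown): string {
  return error instanceof Error ? error.message : String(error);
}
```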

Progress: 41 → 29 detector warnings
Remaining warnings are conservative flags on verified-correct patterns
(catch-and-continue blocks that properly log + notify callers).

Build verified successful. All error handling now provides visibility
for debugging while avoiding excessive logging on hot paths.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* feat: add queue:clear command to remove failed messages

Added functionality to clear failed messages from the observation queue:

**Changes:**
- PendingMessageStore: Added clearFailed() method to delete failed messages
- DataRoutes: Added DELETE /api/pending-queue/failed endpoint
- CLI: Created scripts/clear-failed-queue.ts for interactive queue clearing
- package.json: Added npm run queue:clear script

**Usage:**
  npm run queue:clear          # Interactive - prompts for confirmation
  npm run queue:clear -- --force  # Non-interactive - clears without prompt

Failed messages are observations that exceeded max retry count. They
remain in the queue for debugging but won't be processed. This command
removes them to clean up the queue.

Works alongside existing queue:check and queue:process commands to
provide complete queue management capabilities.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* feat: add --all flag to queue:clear for complete queue reset

Extended queue clearing functionality to support clearing all messages,
not just failed ones.

**Changes:**
- PendingMessageStore: Added clearAll() method to clear pending, processing, and failed
- DataRoutes: Added DELETE /api/pending-queue/all endpoint
- clear-failed-queue.ts: Added --all flag to clear everything
- Updated help text and UI to distinguish between failed-only and all-clear modes

**Usage:**
  npm run queue:clear              # Clear failed only (interactive)
  npm run queue:clear -- --all     # Clear ALL messages (interactive)
  npm run queue:clear -- --all --force  # Clear all without confirmation

The --all flag provides a complete queue reset, removing pending,
processing, and failed messages. Useful when you want a fresh start
or need to cancel stuck sessions.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* feat: add comprehensive documentation for session ID architecture and validation tests

* feat: add logs viewer with clear functionality to UI

- Add LogsRoutes API endpoint for fetching and clearing worker logs
- Create LogsModal component with auto-refresh and clear button
- Integrate logs viewer button into Header component
- Add comprehensive CSS styling for logs modal
- Logs accessible via new document icon button in header

Logs viewer features:
- Display last 1000 lines of current day's log file
- Auto-refresh toggle (2s interval)
- Clear logs button with confirmation
- Monospace font for readable log output
- Responsive modal design matching existing UI
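
The "last 1000 lines" behavior reduces to a small tail operation over the log
file contents. A sketch consistent with the LogsRoutes handler added in this
PR, including its 10,000-line cap:

```typescript
// Return the last `requestedLines` lines of a log file's contents (capped
// at 10k), along with counts the UI can display.
function tailLog(content: string, requestedLines: number) {
  const maxLines = Math.min(requestedLines, 10000); // cap at 10k lines
  const lines = content.split('\n');
  const startIndex = Math.max(0, lines.length - maxLines);
  return {
    logs: lines.slice(startIndex).join('\n'),
    totalLines: lines.length,
    returnedLines: lines.length - startIndex
  };
}
```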

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* refactor: redesign logs as Chrome DevTools-style console drawer

Major UX improvements to match Chrome DevTools console:
- Convert from modal to bottom drawer that slides up
- Move toggle button to bottom-left corner (floating button)
- Add draggable resize handle for height adjustment
- Use plain monospace font (SF Mono/Monaco/Consolas) instead of Monaspace
- Simplify controls with icon-only buttons
- Add Console tab UI matching DevTools aesthetic

Changes:
- Renamed LogsModal to LogsDrawer with drawer implementation
- Added resize functionality with mouse drag
- Removed logs button from header
- Added floating console toggle button in bottom-left
- Updated all CSS to match Chrome console styling
- Minimum height: 150px, maximum: window height - 100px
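
The resize bounds above come down to clamping the drag position (a sketch of
the math only; the component wires this to mousemove events on the handle):

```typescript
// Clamp a proposed drawer height to [150, windowHeight - 100], the bounds
// described in this commit.
function clampDrawerHeight(proposed: number, windowHeight: number): number {
  const MIN_HEIGHT = 150;
  const maxHeight = windowHeight - 100;
  return Math.min(Math.max(proposed, MIN_HEIGHT), maxHeight);
}
```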

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* fix: suppress /api/logs endpoint logging to reduce noise

Skip logging GET /api/logs requests in HTTP middleware to prevent
log spam from auto-refresh polling (every 2s). Keeps the auto-refresh
feature functional while eliminating the repetitive log entries.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* refactor: enhance error handling guidelines with approved overrides for justified exceptions

---------

Co-authored-by: Claude Sonnet 4.5 <noreply@anthropic.com>
Author: Alex Newman
Date: 2026-01-01 23:38:22 -05:00 (committed by GitHub)
Parent: c2fbb39fd0
Commit: 417acb0f81
46 changed files with 2563 additions and 196 deletions
@@ -8,6 +8,7 @@ import { readFileSync, readdirSync } from 'fs';
import { join } from 'path';
import { homedir } from 'os';
import { SessionStore } from '../services/sqlite/SessionStore.js';
import { logger } from '../utils/logger.js';
interface ObservationData {
type: string;
@@ -56,7 +57,8 @@ function buildTimestampMap(): TimestampMapping {
const content = readFileSync(filepath, 'utf-8');
const lines = content.split('\n').filter(l => l.trim());
-for (const line of lines) {
+for (let index = 0; index < lines.length; index++) {
+const line = lines[index];
try {
const data = JSON.parse(line);
const timestamp = data.timestamp;
@@ -75,7 +77,11 @@ function buildTimestampMap(): TimestampMapping {
}
}
} catch (e) {
-// Skip invalid JSON lines
+logger.debug('IMPORT', 'Skipping invalid JSON line', {
+  lineNumber: index + 1,
+  filename,
+  error: e instanceof Error ? e.message : String(e)
+});
}
}
}
@@ -96,13 +96,17 @@ export function buildObservationPrompt(obs: Observation): string {
try {
toolInput = typeof obs.tool_input === 'string' ? JSON.parse(obs.tool_input) : obs.tool_input;
-} catch {
+} catch (error) {
+// Expected: tool_input may not be valid JSON (e.g., plain strings)
+// Not logging - this is a normal fallback for non-JSON tool inputs
toolInput = obs.tool_input; // If parse fails, use raw value
}
try {
toolOutput = typeof obs.tool_output === 'string' ? JSON.parse(obs.tool_output) : obs.tool_output;
-} catch {
+} catch (error) {
+// Expected: tool_output may not be valid JSON (e.g., plain strings)
+// Not logging - this is a normal fallback for non-JSON tool outputs
toolOutput = obs.tool_output; // If parse fails, use raw value
}
@@ -140,6 +140,8 @@ async function verifyWorkerConnection(): Promise<boolean> {
const response = await fetch(`${WORKER_BASE_URL}/api/health`);
return response.ok;
} catch (error) {
// Expected during worker startup or if worker is down
logger.debug('SYSTEM', 'Worker health check failed', undefined, { error: error instanceof Error ? error.message : String(error) });
return false;
}
}
@@ -265,6 +267,7 @@ server.setRequestHandler(CallToolRequestSchema, async (request) => {
try {
return await tool.handler(request.params.arguments || {});
} catch (error: any) {
logger.error('SYSTEM', 'Tool execution failed', undefined, { tool: request.params.name, error: error.message });
return {
content: [{
type: 'text' as const,
@@ -200,6 +200,8 @@ function extractPriorMessages(transcriptPath: string): { userMessage: string; as
}
}
} catch (parseError) {
// Expected: malformed JSON lines in transcript
// Not logging - this loops through many lines, logging each would be excessive
continue;
}
}
@@ -227,7 +229,8 @@ export async function generateContext(input?: ContextInput, useColors: boolean =
try {
unlinkSync(VERSION_MARKER_PATH);
} catch (unlinkError) {
-// Marker might not exist
+// Marker might not exist - expected during first run
+// Not logging - this is a normal case during initial setup
}
logger.error('SYSTEM', 'Native module rebuild needed - restart Claude Code to auto-fix');
return '';
@@ -384,6 +384,33 @@ export class PendingMessageStore {
return result.changes;
}
/**
* Clear all failed messages from the queue
* @returns Number of messages deleted
*/
clearFailed(): number {
const stmt = this.db.prepare(`
DELETE FROM pending_messages
WHERE status = 'failed'
`);
const result = stmt.run();
return result.changes;
}
/**
* Clear all pending, processing, and failed messages from the queue
* Keeps only processed messages (for history)
* @returns Number of messages deleted
*/
clearAll(): number {
const stmt = this.db.prepare(`
DELETE FROM pending_messages
WHERE status IN ('pending', 'processing', 'failed')
`);
const result = stmt.run();
return result.changes;
}
/**
* Convert a PersistentPendingMessage back to PendingMessage format
*/
@@ -66,6 +66,7 @@ function removePidFile(): void {
try {
if (existsSync(PID_FILE)) unlinkSync(PID_FILE);
} catch (error) {
// PID file removal is cleanup - log but don't fail shutdown
logger.warn('SYSTEM', 'Failed to remove PID file', { path: PID_FILE, error: (error as Error).message });
}
}
@@ -128,6 +129,7 @@ export async function updateCursorContextForProject(projectName: string, port: n
writeContextFile(entry.workspacePath, context);
logger.debug('CURSOR', 'Updated context file', { projectName, workspacePath: entry.workspacePath });
} catch (error) {
// Context update is non-critical - log and continue
logger.warn('CURSOR', 'Failed to update context file', { projectName, error: (error as Error).message });
}
}
@@ -147,7 +149,11 @@ async function isPortInUse(port: number): Promise<boolean> {
// Note: Removed AbortSignal.timeout to avoid Windows Bun cleanup issue (libuv assertion)
const response = await fetch(`http://127.0.0.1:${port}/api/health`);
return response.ok;
-} catch { return false; }
+} catch (error) {
+// Expected: port is free or service not responding
+// Not logging - this is called frequently for health checks
+return false;
+}
}
async function waitForHealth(port: number, timeoutMs: number = 30000): Promise<boolean> {
@@ -157,8 +163,11 @@ async function waitForHealth(port: number, timeoutMs: number = 30000): Promise<b
// Note: Removed AbortSignal.timeout to avoid Windows Bun cleanup issue (libuv assertion)
const response = await fetch(`http://127.0.0.1:${port}/api/readiness`);
if (response.ok) return true;
-} catch {
-// Not ready yet
+} catch (error) {
+logger.debug('SYSTEM', 'Service not ready yet, will retry', {
+  port,
+  error: error instanceof Error ? error.message : String(error)
+});
}
await new Promise(r => setTimeout(r, 500));
}
@@ -215,6 +224,8 @@ async function getRunningWorkerVersion(port: number): Promise<string | null> {
const data = await response.json() as { version: string };
return data.version;
} catch {
// Expected: worker not running or version endpoint unavailable
logger.debug('SYSTEM', 'Could not fetch worker version', { port });
return null;
}
}
@@ -256,6 +267,7 @@ import { SessionRoutes } from './worker/http/routes/SessionRoutes.js';
import { DataRoutes } from './worker/http/routes/DataRoutes.js';
import { SearchRoutes } from './worker/http/routes/SearchRoutes.js';
import { SettingsRoutes } from './worker/http/routes/SettingsRoutes.js';
import { LogsRoutes } from './worker/http/routes/LogsRoutes.js';
export class WorkerService {
private app: express.Application;
@@ -285,6 +297,7 @@ export class WorkerService {
private dataRoutes: DataRoutes;
private searchRoutes: SearchRoutes | null;
private settingsRoutes: SettingsRoutes;
private logsRoutes: LogsRoutes;
// Initialization tracking
private initializationComplete: Promise<void>;
@@ -329,6 +342,7 @@ export class WorkerService {
// SearchRoutes needs SearchManager which requires initialized DB - will be created in initializeBackground()
this.searchRoutes = null;
this.settingsRoutes = new SettingsRoutes(this.settingsManager);
this.logsRoutes = new LogsRoutes();
this.setupMiddleware();
this.setupRoutes();
@@ -503,6 +517,7 @@ export class WorkerService {
this.dataRoutes.setupRoutes(this.app);
// searchRoutes is set up after database initialization in initializeBackground()
this.settingsRoutes.setupRoutes(this.app);
this.logsRoutes.setupRoutes(this.app);
// Register early handler for /api/context/inject to avoid 404 during startup
// This handler waits for initialization to complete before delegating to SearchRoutes
@@ -605,8 +620,11 @@ export class WorkerService {
}
try {
execSync(`taskkill /PID ${pid} /T /F`, { timeout: 60000, stdio: 'ignore' });
-} catch {
-// Process may have already exited - continue cleanup
+} catch (error) {
+logger.debug('SYSTEM', 'Failed to kill process, may have already exited', {
+  pid,
+  error: error instanceof Error ? error.message : String(error)
+});
}
}
} else {
@@ -614,7 +632,8 @@ export class WorkerService {
try {
process.kill(pid, 'SIGKILL');
} catch {
-// Process already exited - that's fine
+// Process already exited - expected during cleanup
+logger.debug('SYSTEM', 'Process already exited', { pid });
}
}
}
@@ -747,6 +766,11 @@ export class WorkerService {
session.generatorPromise = this.sdkAgent.startSession(session, this)
.catch(error => {
logger.error('SDK', 'Session generator failed', {
sessionId: session.sessionDbId,
project: session.project
}, error as Error);
// Note: Error is logged but not rethrown - session marked as complete via finally
})
.finally(() => {
session.generatorPromise = null;
@@ -814,6 +838,7 @@ export class WorkerService {
// Small delay between sessions to avoid rate limiting
await new Promise(resolve => setTimeout(resolve, 100));
} catch (error) {
// Recovery is best-effort - skip failed sessions and continue with others
logger.warn('SYSTEM', `Failed to process session ${sessionDbId}`, {}, error as Error);
result.sessionsSkipped++;
}
@@ -978,7 +1003,9 @@ export class WorkerService {
try {
process.kill(pid, 0);
return true;
-} catch {
+} catch (error) {
+// Expected: process has exited
+// Not logging - this is called in a tight loop during cleanup
return false;
}
});
@@ -1385,8 +1412,9 @@ function configureCursorMcp(target: string): number {
if (!config.mcpServers) {
config.mcpServers = {};
}
-} catch {
+} catch (error) {
// Start fresh if corrupt
logger.warn('SYSTEM', 'Corrupt mcp.json, creating new config', { path: mcpJsonPath, error: error instanceof Error ? error.message : String(error) });
config = { mcpServers: {} };
}
}
@@ -204,8 +204,9 @@ export async function switchBranch(targetBranch: string): Promise<SwitchResult>
logger.debug('BRANCH', 'Checking out branch', { branch: targetBranch });
try {
execGit(['checkout', targetBranch]);
-} catch {
+} catch (error) {
// Branch might not exist locally, try tracking remote
logger.debug('BRANCH', 'Branch not local, tracking remote', { branch: targetBranch, error: error instanceof Error ? error.message : String(error) });
execGit(['checkout', '-b', targetBranch, `origin/${targetBranch}`]);
}
@@ -495,9 +495,11 @@ export class GeminiAgent {
}
});
}
// Update Cursor context file for registered projects (fire-and-forget)
-updateCursorContextForProject(session.project, getWorkerPort()).catch(() => {});
+updateCursorContextForProject(session.project, getWorkerPort()).catch(error => {
+  logger.warn('CURSOR', 'Context update failed (non-critical)', { project: session.project }, error as Error);
+});
}
// Mark messages as processed
@@ -538,9 +538,11 @@ export class OpenRouterAgent {
}
});
}
// Update Cursor context file for registered projects (fire-and-forget)
-updateCursorContextForProject(session.project, getWorkerPort()).catch(() => {});
+updateCursorContextForProject(session.project, getWorkerPort()).catch(error => {
+  logger.warn('CURSOR', 'Context update failed (non-critical)', { project: session.project }, error as Error);
+});
}
// Mark messages as processed
@@ -51,8 +51,9 @@ export class PaginationHelper {
// Return as JSON string
return JSON.stringify(strippedPaths);
-} catch (error) {
-// If parsing fails, return original string
+} catch (err) {
+// Expected: file paths may not be valid JSON (plain string)
+// Not logging - normal fallback for non-JSON file path strings
return filePathsStr;
}
}
@@ -469,9 +469,11 @@ export class SDKAgent {
}
});
}
// Update Cursor context file for registered projects (fire-and-forget)
-updateCursorContextForProject(session.project, getWorkerPort()).catch(() => {});
+updateCursorContextForProject(session.project, getWorkerPort()).catch(error => {
+  logger.warn('CURSOR', 'Context update failed (non-critical)', { project: session.project }, error as Error);
+});
}
// Mark messages as processed after successful observation/summary storage
@@ -1400,7 +1400,9 @@ export class SearchManager {
if (Array.isArray(filesRead) && filesRead.length > 0) {
lines.push(`**Files Read:** ${filesRead.join(', ')}`);
}
-} catch {
+} catch (error) {
+// Expected: files_read may not be valid JSON (plain string)
+// Not logging - normal fallback for plain text file lists
if (summary.files_read.trim()) {
lines.push(`**Files Read:** ${summary.files_read}`);
}
@@ -1414,7 +1416,9 @@ export class SearchManager {
if (Array.isArray(filesEdited) && filesEdited.length > 0) {
lines.push(`**Files Edited:** ${filesEdited.join(', ')}`);
}
-} catch {
+} catch (error) {
+// Expected: files_edited may not be valid JSON (plain string)
+// Not logging - normal fallback for plain text file lists
if (summary.files_edited.trim()) {
lines.push(`**Files Edited:** ${summary.files_edited}`);
}
@@ -1696,6 +1700,7 @@ export class SearchManager {
}]
};
} catch (error: any) {
logger.error('SEARCH', 'Timeline query failed', { query, anchor }, error);
return {
content: [{
type: 'text' as const,
@@ -286,7 +286,9 @@ export class SessionManager {
// Wait for generator to finish
if (session.generatorPromise) {
-await session.generatorPromise.catch(() => {});
+await session.generatorPromise.catch(error => {
+  logger.debug('SYSTEM', 'Generator already failed, cleaning up', { sessionId: session.sessionDbId });
+});
}
// Cleanup
@@ -26,6 +26,7 @@ export abstract class BaseRouteHandler {
result.catch(error => this.handleError(res, error as Error));
}
} catch (error) {
logger.error('HTTP', 'Route handler error', { path: req.path }, error as Error);
this.handleError(res, error as Error);
}
};
@@ -29,10 +29,11 @@ export function createMiddleware(
// HTTP request/response logging
middlewares.push((req: Request, res: Response, next: NextFunction) => {
-// Skip logging for static assets and health checks
+// Skip logging for static assets, health checks, and polling endpoints
const staticExtensions = ['.html', '.js', '.css', '.svg', '.png', '.jpg', '.jpeg', '.webp', '.woff', '.woff2', '.ttf', '.eot'];
const isStaticAsset = staticExtensions.some(ext => req.path.endsWith(ext));
-if (req.path.startsWith('/health') || req.path === '/' || isStaticAsset) {
+const isPollingEndpoint = req.path === '/api/logs'; // Skip logs endpoint to avoid noise from auto-refresh
+if (req.path.startsWith('/health') || req.path === '/' || isStaticAsset || isPollingEndpoint) {
return next();
}
@@ -55,6 +55,8 @@ export class DataRoutes extends BaseRouteHandler {
// Pending queue management endpoints
app.get('/api/pending-queue', this.handleGetPendingQueue.bind(this));
app.post('/api/pending-queue/process', this.handleProcessPendingQueue.bind(this));
app.delete('/api/pending-queue/failed', this.handleClearFailedQueue.bind(this));
app.delete('/api/pending-queue/all', this.handleClearAllQueue.bind(this));
// Import endpoint
app.post('/api/import', this.handleImport.bind(this));
@@ -423,4 +425,42 @@ export class DataRoutes extends BaseRouteHandler {
...result
});
});
/**
* Clear all failed messages from the queue
* DELETE /api/pending-queue/failed
* Returns the number of messages cleared
*/
private handleClearFailedQueue = this.wrapHandler((req: Request, res: Response): void => {
const { PendingMessageStore } = require('../../../sqlite/PendingMessageStore.js');
const pendingStore = new PendingMessageStore(this.dbManager.getSessionStore().db, 3);
const clearedCount = pendingStore.clearFailed();
logger.info('QUEUE', 'Cleared failed queue messages', { clearedCount });
res.json({
success: true,
clearedCount
});
});
/**
* Clear all messages from the queue (pending, processing, and failed)
* DELETE /api/pending-queue/all
* Returns the number of messages cleared
*/
private handleClearAllQueue = this.wrapHandler((req: Request, res: Response): void => {
const { PendingMessageStore } = require('../../../sqlite/PendingMessageStore.js');
const pendingStore = new PendingMessageStore(this.dbManager.getSessionStore().db, 3);
const clearedCount = pendingStore.clearAll();
logger.warn('QUEUE', 'Cleared ALL queue messages (pending, processing, failed)', { clearedCount });
res.json({
success: true,
clearedCount
});
});
}
@@ -0,0 +1,96 @@
/**
* Logs Routes
*
* Handles fetching and clearing log files from ~/.claude-mem/logs/
*/
import express, { Request, Response } from 'express';
import { readFileSync, existsSync, writeFileSync, readdirSync } from 'fs';
import { join } from 'path';
import { logger } from '../../../../utils/logger.js';
import { SettingsDefaultsManager } from '../../../../shared/SettingsDefaultsManager.js';
import { BaseRouteHandler } from '../BaseRouteHandler.js';
export class LogsRoutes extends BaseRouteHandler {
private getLogFilePath(): string {
const dataDir = SettingsDefaultsManager.get('CLAUDE_MEM_DATA_DIR');
const logsDir = join(dataDir, 'logs');
const date = new Date().toISOString().split('T')[0];
return join(logsDir, `claude-mem-${date}.log`);
}
private getLogsDir(): string {
const dataDir = SettingsDefaultsManager.get('CLAUDE_MEM_DATA_DIR');
return join(dataDir, 'logs');
}
setupRoutes(app: express.Application): void {
app.get('/api/logs', this.handleGetLogs.bind(this));
app.post('/api/logs/clear', this.handleClearLogs.bind(this));
}
/**
* GET /api/logs
* Returns the current day's log file contents
* Query params:
* - lines: number of lines to return (default: 1000, max: 10000)
*/
private handleGetLogs = this.wrapHandler((req: Request, res: Response): void => {
const logFilePath = this.getLogFilePath();
if (!existsSync(logFilePath)) {
res.json({
logs: '',
path: logFilePath,
exists: false
});
return;
}
const requestedLines = parseInt(req.query.lines as string || '1000', 10);
const maxLines = Math.min(requestedLines, 10000); // Cap at 10k lines
const content = readFileSync(logFilePath, 'utf-8');
const lines = content.split('\n');
// Return the last N lines
const startIndex = Math.max(0, lines.length - maxLines);
const recentLines = lines.slice(startIndex).join('\n');
res.json({
logs: recentLines,
path: logFilePath,
exists: true,
totalLines: lines.length,
returnedLines: lines.length - startIndex
});
});
/**
* POST /api/logs/clear
* Clears the current day's log file
*/
private handleClearLogs = this.wrapHandler((req: Request, res: Response): void => {
const logFilePath = this.getLogFilePath();
if (!existsSync(logFilePath)) {
res.json({
success: true,
message: 'Log file does not exist',
path: logFilePath
});
return;
}
// Clear the log file by writing empty string
writeFileSync(logFilePath, '', 'utf-8');
logger.info('SYSTEM', 'Log file cleared via UI', { path: logFilePath });
res.json({
success: true,
message: 'Log file cleared',
path: logFilePath
});
});
}
@@ -211,6 +211,7 @@ export class SessionRoutes extends BaseRouteHandler {
}
} catch (e) {
// Ignore errors during recovery check, but still abort to prevent leaks
logger.debug('SESSION', 'Error during recovery check, aborting to prevent leaks', { sessionId: sessionDbId, error: e instanceof Error ? e.message : String(e) });
session.abortController.abort();
}
}
@@ -348,7 +348,9 @@ export class SettingsRoutes extends BaseRouteHandler {
if (settings.CLAUDE_MEM_OPENROUTER_SITE_URL) {
try {
new URL(settings.CLAUDE_MEM_OPENROUTER_SITE_URL);
-} catch {
+} catch (error) {
+// Invalid URL format
+logger.debug('SETTINGS', 'Invalid URL format', { url: settings.CLAUDE_MEM_OPENROUTER_SITE_URL, error: error instanceof Error ? error.message : String(error) });
return { valid: false, error: 'CLAUDE_MEM_OPENROUTER_SITE_URL must be a valid URL' };
}
}
@@ -102,7 +102,9 @@ export function getCurrentProjectName(): string {
windowsHide: true
}).trim();
return basename(gitRoot);
-} catch {
+} catch (error) {
+// Expected: not a git repo or git not available
+// Not logging - this is a common fallback path
return basename(process.cwd());
}
}
@@ -16,6 +16,7 @@ export function parseJsonArray(json: string | null): string[] {
const parsed = JSON.parse(json);
return Array.isArray(parsed) ? parsed : [];
} catch (err) {
// [APPROVED OVERRIDE]: Expected JSON parse failures for malformed data fields, too frequent to log
return [];
}
}
@@ -124,8 +124,12 @@ export async function ensureWorkerRunning(): Promise<void> {
await checkWorkerVersion(); // logs warning on mismatch, doesn't restart
return;
}
-} catch {
-// Continue polling
+} catch (e) {
+logger.debug('SYSTEM', 'Worker health check failed, will retry', {
+  attempt: i + 1,
+  maxRetries,
+  error: e instanceof Error ? e.message : String(e)
+});
}
await new Promise(r => setTimeout(r, pollInterval));
}
@@ -2472,6 +2472,177 @@
border-color: var(--color-bg-button-hover);
}
/* Console Drawer - Chrome DevTools Style */
.console-toggle-btn {
position: fixed;
bottom: 20px;
left: 20px;
width: 48px;
height: 48px;
border-radius: 50%;
background: var(--color-bg-button);
border: none;
color: white;
cursor: pointer;
display: flex;
align-items: center;
justify-content: center;
box-shadow: 0 4px 12px rgba(0, 0, 0, 0.2);
transition: all 0.2s ease;
z-index: 999;
}
.console-toggle-btn:hover {
background: var(--color-bg-button-hover);
transform: scale(1.05);
box-shadow: 0 6px 16px rgba(0, 0, 0, 0.3);
}
.console-toggle-btn svg {
width: 20px;
height: 20px;
}
.console-drawer {
position: fixed;
bottom: 0;
left: 0;
right: 0;
background: var(--color-bg-primary);
border-top: 1px solid var(--color-border-primary);
box-shadow: 0 -4px 12px rgba(0, 0, 0, 0.1);
z-index: 1000;
display: flex;
flex-direction: column;
}
.console-resize-handle {
position: absolute;
top: 0;
left: 0;
right: 0;
height: 6px;
cursor: ns-resize;
display: flex;
align-items: center;
justify-content: center;
}
.console-resize-handle:hover .console-resize-bar {
background: var(--color-bg-button);
}
.console-resize-bar {
width: 40px;
height: 3px;
border-radius: 2px;
background: var(--color-border-primary);
transition: background 0.2s ease;
}
.console-header {
display: flex;
align-items: center;
justify-content: space-between;
padding: 8px 12px;
border-bottom: 1px solid var(--color-border-primary);
background: var(--color-bg-header);
margin-top: 6px;
}
.console-tabs {
display: flex;
gap: 4px;
}
.console-tab {
padding: 4px 12px;
font-size: 12px;
color: var(--color-text-secondary);
background: transparent;
border: none;
cursor: pointer;
border-bottom: 2px solid transparent;
}
.console-tab.active {
color: var(--color-text-primary);
border-bottom-color: var(--color-bg-button);
font-weight: 500;
}
.console-controls {
display: flex;
align-items: center;
gap: 8px;
}
.console-auto-refresh {
display: flex;
align-items: center;
gap: 4px;
font-size: 11px;
color: var(--color-text-secondary);
cursor: pointer;
user-select: none;
}
.console-auto-refresh input[type="checkbox"] {
cursor: pointer;
}
.console-control-btn {
background: transparent;
border: none;
color: var(--color-text-secondary);
cursor: pointer;
padding: 4px 8px;
font-size: 14px;
border-radius: 4px;
transition: all 0.15s ease;
}
.console-control-btn:hover {
background: var(--color-bg-card-hover);
color: var(--color-text-primary);
}
.console-control-btn:disabled {
opacity: 0.4;
cursor: not-allowed;
}
.console-clear-btn:hover {
color: var(--color-accent-error);
}
.console-content {
flex: 1;
overflow: auto;
background: var(--color-bg-primary);
}
.console-logs {
margin: 0;
padding: 8px 12px;
font-family: 'SF Mono', 'Monaco', 'Menlo', 'Consolas', 'Courier New', monospace;
font-size: 11px;
line-height: 1.5;
color: var(--color-text-primary);
white-space: pre-wrap;
word-wrap: break-word;
overflow-wrap: break-word;
}
.console-error {
padding: 8px 12px;
background: rgba(239, 68, 68, 0.08);
border-bottom: 1px solid var(--color-accent-error);
color: var(--color-accent-error);
font-size: 11px;
font-family: 'SF Mono', 'Monaco', 'Menlo', 'Consolas', 'Courier New', monospace;
}
/* Responsive Modal */
@media (max-width: 900px) {
.modal-body {
@@ -2,6 +2,7 @@ import React, { useState, useEffect, useCallback, useMemo } from 'react';
import { Header } from './components/Header';
import { Feed } from './components/Feed';
import { ContextSettingsModal } from './components/ContextSettingsModal';
import { LogsDrawer } from './components/LogsModal';
import { useSSE } from './hooks/useSSE';
import { useSettings } from './hooks/useSettings';
import { useStats } from './hooks/useStats';
@@ -13,6 +14,7 @@ import { mergeAndDeduplicateByProject } from './utils/data';
export function App() {
const [currentFilter, setCurrentFilter] = useState('');
const [contextPreviewOpen, setContextPreviewOpen] = useState(false);
const [logsModalOpen, setLogsModalOpen] = useState(false);
const [paginatedObservations, setPaginatedObservations] = useState<Observation[]>([]);
const [paginatedSummaries, setPaginatedSummaries] = useState<Summary[]>([]);
const [paginatedPrompts, setPaginatedPrompts] = useState<UserPrompt[]>([]);
@@ -53,6 +55,11 @@ export function App() {
setContextPreviewOpen(prev => !prev);
}, []);
// Toggle logs modal
const toggleLogsModal = useCallback(() => {
setLogsModalOpen(prev => !prev);
}, []);
// Handle loading more data
const handleLoadMore = useCallback(async () => {
try {
@@ -116,6 +123,22 @@ export function App() {
isSaving={isSaving}
saveStatus={saveStatus}
/>
<button
className="console-toggle-btn"
onClick={toggleLogsModal}
title="Toggle Console"
>
<svg viewBox="0 0 24 24" fill="none" stroke="currentColor" strokeWidth="2" strokeLinecap="round" strokeLinejoin="round">
<polyline points="4 17 10 11 4 5"></polyline>
<line x1="12" y1="19" x2="20" y2="19"></line>
</svg>
</button>
<LogsDrawer
isOpen={logsModalOpen}
onClose={toggleLogsModal}
/>
</>
);
}
@@ -0,0 +1,166 @@
import React, { useState, useEffect, useCallback, useRef } from 'react';
interface LogsDrawerProps {
isOpen: boolean;
onClose: () => void;
}
export function LogsDrawer({ isOpen, onClose }: LogsDrawerProps) {
const [logs, setLogs] = useState<string>('');
const [isLoading, setIsLoading] = useState(false);
const [error, setError] = useState<string | null>(null);
const [autoRefresh, setAutoRefresh] = useState(false);
const [height, setHeight] = useState(300); // Default height
const [isResizing, setIsResizing] = useState(false);
const startYRef = useRef(0);
const startHeightRef = useRef(0);
const fetchLogs = useCallback(async () => {
setIsLoading(true);
setError(null);
try {
const response = await fetch('/api/logs');
if (!response.ok) {
throw new Error(`Failed to fetch logs: ${response.statusText}`);
}
const data = await response.json();
setLogs(data.logs || '');
} catch (err) {
setError(err instanceof Error ? err.message : 'Unknown error');
} finally {
setIsLoading(false);
}
}, []);
const handleClearLogs = useCallback(async () => {
if (!confirm('Are you sure you want to clear all logs?')) {
return;
}
setIsLoading(true);
setError(null);
try {
const response = await fetch('/api/logs/clear', { method: 'POST' });
if (!response.ok) {
throw new Error(`Failed to clear logs: ${response.statusText}`);
}
setLogs('');
} catch (err) {
setError(err instanceof Error ? err.message : 'Unknown error');
} finally {
setIsLoading(false);
}
}, []);
// Handle resize
const handleMouseDown = useCallback((e: React.MouseEvent) => {
e.preventDefault();
setIsResizing(true);
startYRef.current = e.clientY;
startHeightRef.current = height;
}, [height]);
useEffect(() => {
if (!isResizing) return;
const handleMouseMove = (e: MouseEvent) => {
const deltaY = startYRef.current - e.clientY;
const newHeight = Math.min(Math.max(150, startHeightRef.current + deltaY), window.innerHeight - 100);
setHeight(newHeight);
};
const handleMouseUp = () => {
setIsResizing(false);
};
document.addEventListener('mousemove', handleMouseMove);
document.addEventListener('mouseup', handleMouseUp);
return () => {
document.removeEventListener('mousemove', handleMouseMove);
document.removeEventListener('mouseup', handleMouseUp);
};
}, [isResizing]);
// Fetch logs when drawer opens
useEffect(() => {
if (isOpen) {
fetchLogs();
}
}, [isOpen, fetchLogs]);
// Auto-refresh logs every 2 seconds if enabled
useEffect(() => {
if (!isOpen || !autoRefresh) {
return;
}
const interval = setInterval(fetchLogs, 2000);
return () => clearInterval(interval);
}, [isOpen, autoRefresh, fetchLogs]);
if (!isOpen) {
return null;
}
return (
<div className="console-drawer" style={{ height: `${height}px` }}>
<div
className="console-resize-handle"
onMouseDown={handleMouseDown}
>
<div className="console-resize-bar" />
</div>
<div className="console-header">
<div className="console-tabs">
<div className="console-tab active">Console</div>
</div>
<div className="console-controls">
<label className="console-auto-refresh">
<input
type="checkbox"
checked={autoRefresh}
onChange={(e) => setAutoRefresh(e.target.checked)}
/>
Auto-refresh
</label>
<button
className="console-control-btn"
onClick={fetchLogs}
disabled={isLoading}
title="Refresh logs"
>
↻
</button>
<button
className="console-control-btn console-clear-btn"
onClick={handleClearLogs}
disabled={isLoading}
title="Clear logs"
>
🗑
</button>
<button
className="console-control-btn"
onClick={onClose}
title="Close console"
>
✕
</button>
</div>
</div>
{error && (
<div className="console-error">
{error}
</div>
)}
<div className="console-content">
<pre className="console-logs">
{logs || 'No logs available'}
</pre>
</div>
</div>
);
}
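The resize math in the `mousemove` handler above (minimum 150px, maximum viewport height minus 100px) can be factored into a pure helper. A minimal sketch, assuming illustrative names — `clampDrawerHeight` is not part of the diff:

```typescript
// Hypothetical pure helper mirroring the clamp in LogsDrawer's mousemove handler.
// Dragging up (currentY < startY) grows the drawer; the result is kept between
// minHeight and viewportHeight - topMargin, matching the literals 150 and 100 above.
export function clampDrawerHeight(
  startHeight: number,
  startY: number,
  currentY: number,
  viewportHeight: number,
  minHeight = 150,
  topMargin = 100
): number {
  const deltaY = startY - currentY;
  return Math.min(Math.max(minHeight, startHeight + deltaY), viewportHeight - topMargin);
}
```

Keeping the clamp pure makes the resize behavior unit-testable without mounting the component.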
@@ -58,6 +58,7 @@ export function useContextPreview(settings: Settings): UseContextPreviewResult {
setError('Failed to load preview');
}
} catch (err) {
console.warn('Failed to load context preview:', err);
setError((err as Error).message);
} finally {
setIsLoading(false);
@@ -78,6 +78,7 @@ export function useSettings() {
setSaveStatus(`✗ Error: ${result.error}`);
}
} catch (error) {
console.error('Failed to save settings:', error);
setSaveStatus(`✗ Error: ${error instanceof Error ? error.message : 'Unknown error'}`);
} finally {
setIsSaving(false);
@@ -9,6 +9,7 @@ import { spawnSync } from 'child_process';
import { existsSync } from 'fs';
import { join } from 'path';
import { homedir } from 'os';
import { logger } from './logger.js';
/**
* Get the Bun executable path
@@ -28,8 +29,10 @@ export function getBunPath(): string | null {
if (result.status === 0) {
return 'bun'; // Available in PATH
}
} catch {
// Not in PATH, continue to check common locations
} catch (e) {
logger.debug('SYSTEM', 'Bun not found in PATH, checking common installation locations', {
error: e instanceof Error ? e.message : String(e)
});
}
// Check common installation paths
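The `getBunPath` fix above logs the PATH miss instead of swallowing it, then falls back to known install locations. A generic sketch of that lookup shape, under stated assumptions — `findExecutable` and its candidate list are illustrative, not the project's actual code:

```typescript
import { spawnSync } from "node:child_process";
import { existsSync } from "node:fs";

// Illustrative shape of the PATH-then-fallback lookup used by getBunPath.
// In the real code the PATH miss is logged via logger.debug before falling
// back; here the catch is left as a plain comment for brevity.
export function findExecutable(name: string, candidates: string[]): string | null {
  try {
    // spawnSync reports a missing command via result.error/status, not a throw,
    // so this try-catch is purely defensive.
    const result = spawnSync(name, ["--version"], { stdio: "ignore" });
    if (result.status === 0) return name; // resolvable via PATH
  } catch {
    // fall through to candidate paths (the real code logs here)
  }
  for (const candidate of candidates) {
    if (existsSync(candidate)) return candidate;
  }
  return null;
}
```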
@@ -7,6 +7,7 @@
import { existsSync, readFileSync, writeFileSync, mkdirSync, renameSync } from 'fs';
import { join, basename } from 'path';
import { logger } from './logger.js';
// ============================================================================
// Types
@@ -40,7 +41,11 @@ export function readCursorRegistry(registryFile: string): CursorProjectRegistry
try {
if (!existsSync(registryFile)) return {};
return JSON.parse(readFileSync(registryFile, 'utf-8'));
} catch {
} catch (error) {
logger.warn('CONFIG', 'Failed to read Cursor registry, using empty registry', {
file: registryFile,
error: error instanceof Error ? error.message : String(error)
});
return {};
}
}
@@ -145,8 +150,11 @@ export function configureCursorMcp(mcpJsonPath: string, mcpServerScriptPath: str
if (!config.mcpServers) {
config.mcpServers = {};
}
} catch {
// Start fresh if corrupt
} catch (error) {
logger.warn('CONFIG', 'Failed to read MCP config, starting fresh', {
file: mcpJsonPath,
error: error instanceof Error ? error.message : String(error)
});
config = { mcpServers: {} };
}
}
@@ -173,8 +181,11 @@ export function removeMcpConfig(mcpJsonPath: string): void {
delete config.mcpServers['claude-mem'];
writeFileSync(mcpJsonPath, JSON.stringify(config, null, 2));
}
} catch {
// Ignore errors during cleanup
} catch (e) {
logger.warn('CURSOR', 'Failed to remove MCP config during cleanup', {
mcpJsonPath,
error: e instanceof Error ? e.message : String(e)
});
}
}
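Both cursor-utils fixes follow the same shape: read JSON, log a warning on failure, return a safe default. A hedged generic sketch of that pattern — the helper name `readJsonWithFallback` and the injectable `warn` callback are illustrative, not from the codebase:

```typescript
import { existsSync, readFileSync } from "node:fs";

// Illustrative generic form of the pattern in readCursorRegistry and
// configureCursorMcp: a corrupt or missing file never crashes the caller,
// but always leaves a logged trace instead of an empty catch.
export function readJsonWithFallback<T>(
  file: string,
  fallback: T,
  warn: (msg: string, ctx: Record<string, unknown>) => void = (m, c) =>
    console.warn(m, c)
): T {
  try {
    if (!existsSync(file)) return fallback;
    return JSON.parse(readFileSync(file, "utf-8")) as T;
  } catch (error) {
    warn("Failed to read JSON file, using fallback", {
      file,
      error: error instanceof Error ? error.message : String(error),
    });
    return fallback;
  }
}
```

Injecting the logger keeps the fallback path observable in tests, which is exactly what the anti-pattern detector checks for.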
@@ -267,7 +267,8 @@ class Logger {
try {
appendFileSync(this.logFilePath, logLine + '\n', 'utf8');
} catch (error) {
// If file write fails, write to stderr as last resort
// Logger can't log its own failures - use stderr as last resort
// This is expected during disk full / permission errors
process.stderr.write(`[LOGGER] Failed to write to log file: ${error}\n`);
}
} else {
@@ -42,6 +42,8 @@ export class TranscriptParser {
const entry = JSON.parse(line) as TranscriptEntry;
this.entries.push(entry);
} catch (error) {
// Note: Parse errors are accumulated and accessible via getParseErrors()
// Not logging each individual line failure - would be too verbose for large transcripts
this.parseErrors.push({
lineNumber: index + 1,
error: error instanceof Error ? error.message : String(error),
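The TranscriptParser hunk above deliberately accumulates per-line parse errors for later inspection rather than logging each one. A minimal standalone sketch of that accumulate-don't-log pattern — `parseJsonl` and its return shape are illustrative, not the parser's real API:

```typescript
interface JsonlParseError {
  lineNumber: number;
  error: string;
}

// Illustrative JSONL loop: bad lines are recorded with their 1-based line
// number and inspected in bulk afterwards (cf. getParseErrors above),
// instead of emitting one log entry per bad line in a large transcript.
export function parseJsonl<T>(text: string): { entries: T[]; parseErrors: JsonlParseError[] } {
  const entries: T[] = [];
  const parseErrors: JsonlParseError[] = [];
  text.split("\n").forEach((line, index) => {
    if (!line.trim()) return; // skip blank lines
    try {
      entries.push(JSON.parse(line) as T);
    } catch (error) {
      parseErrors.push({
        lineNumber: index + 1,
        error: error instanceof Error ? error.message : String(error),
      });
    }
  });
  return { entries, parseErrors };
}
```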