refactor: Complete rewrite of worker-utils.ts and cleanup of worker-service.ts
- Removed fragile PM2 string parsing and replaced with direct PM2 restart logic. - Eliminated silent error handling in worker-utils.ts for better error visibility. - Extracted duplicated session auto-creation logic into a new helper method getOrCreateSession() in worker-service.ts. - Centralized configuration values and replaced magic numbers with named constants. - Updated health check logic to ensure worker is restarted if unhealthy. - Removed unnecessary getWorkerPort() wrapper function. - Improved overall code quality and maintainability by applying DRY and YAGNI principles.
This commit is contained in:
@@ -0,0 +1,416 @@
|
||||
# Processing Indicator "Fucking Stupid" Audit
|
||||
|
||||
## What It SHOULD Do (Simple Version)
|
||||
|
||||
1. **Page load**: Check if worker is already processing → spin or don't spin
|
||||
2. **UserPromptSubmit**: Start spinning, set worker status "on"
|
||||
3. **Summary complete**: Stop spinning, set worker status "off"
|
||||
|
||||
**Result**: One boolean. Simple. Clear.
|
||||
|
||||
---
|
||||
|
||||
## What It ACTUALLY Does (Overcomplicated Version)
|
||||
|
||||
### Problem 1: Set<string> Instead of Boolean
|
||||
|
||||
**Current**: `processingSessions: Set<string>` - tracks individual session IDs
|
||||
|
||||
**File**: `src/ui/viewer/hooks/useSSE.ts:12`
|
||||
```typescript
|
||||
const [processingSessions, setProcessingSessions] = useState<Set<string>>(new Set());
|
||||
```
|
||||
|
||||
**Why it's stupid**: We don't care WHICH sessions are processing. We just need to know IF anything is processing. The conversion to boolean happens anyway:
|
||||
|
||||
**File**: `src/ui/viewer/App.tsx:92`
|
||||
```typescript
|
||||
isProcessing={processingSessions.size > 0} // ← Converting Set to boolean!
|
||||
```
|
||||
|
||||
**Fix**: Just use `const [isProcessing, setIsProcessing] = useState(false)`
|
||||
|
||||
---
|
||||
|
||||
### Problem 2: Complex Set Manipulation
|
||||
|
||||
**Current**: Add/remove session IDs from Set based on SSE events
|
||||
|
||||
**File**: `src/ui/viewer/hooks/useSSE.ts:90-104`
|
||||
```typescript
|
||||
case 'processing_status':
|
||||
if (data.processing) {
|
||||
const processing = data.processing;
|
||||
console.log('[SSE] Processing status:', processing);
|
||||
setProcessingSessions(prev => {
|
||||
const next = new Set(prev);
|
||||
if (processing.is_processing) {
|
||||
next.add(processing.session_id); // ← Why track session ID?
|
||||
} else {
|
||||
next.delete(processing.session_id); // ← Just need true/false
|
||||
}
|
||||
return next;
|
||||
});
|
||||
}
|
||||
break;
|
||||
```
|
||||
|
||||
**Why it's stupid**: Creating new Sets, adding/removing items, all to track individual sessions when we only care about "any processing yes/no"
|
||||
|
||||
**Fix**: `setIsProcessing(data.is_processing)`
|
||||
|
||||
---
|
||||
|
||||
### Problem 3: Defensive Cleanup in Multiple Places
|
||||
|
||||
**Current**: Two places remove sessions from the Set
|
||||
|
||||
**Location 1** - `useSSE.ts:90-104` - Handles `processing_status` events
|
||||
**Location 2** - `useSSE.ts:73-78` - Handles `new_summary` events
|
||||
|
||||
```typescript
|
||||
// Mark session as no longer processing (summary is the final step)
|
||||
setProcessingSessions(prev => {
|
||||
const next = new Set(prev);
|
||||
next.delete(summary.session_id); // ← Defensive cleanup
|
||||
return next;
|
||||
});
|
||||
```
|
||||
|
||||
**Why it's stupid**: We're defensively cleaning up in case events arrive out of order. This is a band-aid for not having a single source of truth.
|
||||
|
||||
**Fix**: One place sets `isProcessing = false` (summary complete). No defensive cleanup needed.
|
||||
|
||||
---
|
||||
|
||||
### Problem 4: SSE Event Includes Session ID
|
||||
|
||||
**Current**: Processing status events include session ID
|
||||
|
||||
**File**: `src/services/worker-service.ts:277-285`
|
||||
```typescript
|
||||
private broadcastProcessingStatus(claudeSessionId: string, isProcessing: boolean): void {
|
||||
this.broadcastSSE({
|
||||
type: 'processing_status',
|
||||
processing: {
|
||||
session_id: claudeSessionId, // ← Why send session ID?
|
||||
is_processing: isProcessing
|
||||
}
|
||||
});
|
||||
}
|
||||
```
|
||||
|
||||
**Why it's stupid**: We send session_id but never use it for the spinner decision. The logomark doesn't care WHICH session is processing.
|
||||
|
||||
**Fix**: `{ type: 'processing_status', isProcessing: boolean }` - That's it.
|
||||
|
||||
---
|
||||
|
||||
### Problem 5: TypeScript Interface Overcomplicated
|
||||
|
||||
**Current**: StreamEvent includes processing object with session_id
|
||||
|
||||
**File**: `src/ui/viewer/types.ts:54-57`
|
||||
```typescript
|
||||
processing?: {
|
||||
session_id: string; // ← Unnecessary
|
||||
is_processing: boolean;
|
||||
};
|
||||
```
|
||||
|
||||
**Why it's stupid**: Adds complexity to type definitions when we only need the boolean.
|
||||
|
||||
**Fix**: `isProcessing?: boolean;`
|
||||
|
||||
---
|
||||
|
||||
### Problem 6: Multiple Broadcast Points (But No Initial State!)
|
||||
|
||||
**Current**: 3 places broadcast processing status in worker-service.ts
|
||||
|
||||
1. **Line 817**: `handleSummarize()` → `broadcastProcessingStatus(session.claudeSessionId, true)`
|
||||
2. **Line 1153**: `processSummarizeMessage()` success → `broadcastProcessingStatus(session.claudeSessionId, false)`
|
||||
3. **Line 1183**: `processSummarizeMessage()` no summary → `broadcastProcessingStatus(session.claudeSessionId, false)`
|
||||
|
||||
**Why it's stupid**: We broadcast changes but there's NO WAY TO GET INITIAL STATE on page load. If you open the viewer while processing is active, you won't see the spinner until the next status change.
|
||||
|
||||
**Fix**: Add `/api/processing-status` endpoint that returns current state. Call it on page load.
|
||||
|
||||
---
|
||||
|
||||
### Problem 7: Skeleton Cards Require Session Tracking
|
||||
|
||||
**Current**: Feed.tsx creates skeleton cards for each processing session
|
||||
|
||||
**File**: `src/ui/viewer/components/Feed.tsx:66-80`
|
||||
```typescript
|
||||
const skeletons: FeedItem[] = [];
|
||||
processingSessions.forEach(sessionId => { // ← Iterating over Set
|
||||
if (!sessionsWithSummaries.has(sessionId)) {
|
||||
const prompt = sessionPrompts.get(sessionId);
|
||||
skeletons.push({
|
||||
itemType: 'skeleton',
|
||||
id: sessionId,
|
||||
session_id: sessionId, // ← Using individual session IDs
|
||||
project: prompt?.project,
|
||||
created_at_epoch: Date.now()
|
||||
});
|
||||
}
|
||||
});
|
||||
```
|
||||
|
||||
**Why it's relevant**: This is the ONLY place that actually uses individual session IDs. If we want per-session skeleton cards, we need session tracking.
|
||||
|
||||
**Question for you**: Do we still want skeleton cards in the feed? Or just the logomark spinner?
|
||||
|
||||
**Option A**: Keep skeleton cards → Need to track session IDs (current complexity justified)
|
||||
**Option B**: Remove skeleton cards → Use simple boolean for logomark only
|
||||
|
||||
---
|
||||
|
||||
### Problem 8: No Synchronization Between Worker State and UI State
|
||||
|
||||
**Current**: Worker doesn't maintain processing state. It just broadcasts events.
|
||||
|
||||
**Why it's stupid**: If the UI disconnects/reconnects, it loses processing state. Worker should be the source of truth.
|
||||
|
||||
**Fix**: Worker maintains `private isProcessing: boolean = false`
|
||||
- Set to true on summarize request
|
||||
- Set to false when summary completes
|
||||
- Expose via `/api/processing-status` endpoint
|
||||
- Broadcast changes via SSE
|
||||
|
||||
---
|
||||
|
||||
## The "Fucking Stupid" Score
|
||||
|
||||
| Issue | Complexity Cost | Why It's Stupid |
|
||||
|-------|----------------|-----------------|
|
||||
| Set<string> instead of boolean | HIGH | We convert it to boolean anyway |
|
||||
| Complex Set manipulation | HIGH | 10+ lines of code to add/remove from Set |
|
||||
| Defensive cleanup in 2 places | MEDIUM | Band-aid for lack of single source of truth |
|
||||
| SSE includes unused session_id | LOW | Minor overhead, but conceptually wrong |
|
||||
| Overcomplicated TypeScript types | LOW | Makes code harder to read |
|
||||
| No initial state endpoint | HIGH | Broken user experience (no spinner on page load during active processing) |
|
||||
| Session tracking for skeletons | ??? | Depends if we want per-session skeletons or not |
|
||||
| Worker has no state | HIGH | UI is source of truth, should be worker |
|
||||
|
||||
---
|
||||
|
||||
## Proposed Simple Architecture
|
||||
|
||||
### Worker Service (Source of Truth)
|
||||
|
||||
```typescript
|
||||
class WorkerService {
|
||||
private isProcessing: boolean = false; // Single source of truth
|
||||
|
||||
// New endpoint: GET /api/processing-status
|
||||
private handleGetProcessingStatus(req: Request, res: Response): void {
|
||||
res.json({ isProcessing: this.isProcessing });
|
||||
}
|
||||
|
||||
// On summarize request
|
||||
private handleSummarize(req: Request, res: Response): void {
|
||||
// ... existing code ...
|
||||
this.isProcessing = true;
|
||||
this.broadcastSSE({ type: 'processing_status', isProcessing: true });
|
||||
// ...
|
||||
}
|
||||
|
||||
// On summary complete
|
||||
private processSummarizeMessage(session: SessionState, message: Message): void {
|
||||
// ... existing code ...
|
||||
|
||||
// After summary is saved/failed:
|
||||
this.isProcessing = false;
|
||||
this.broadcastSSE({ type: 'processing_status', isProcessing: false });
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### React Hook (Simple Boolean)
|
||||
|
||||
```typescript
|
||||
export function useSSE() {
|
||||
const [isProcessing, setIsProcessing] = useState(false);
|
||||
|
||||
// On mount: Get initial state
|
||||
useEffect(() => {
|
||||
fetch('/api/processing-status')
|
||||
.then(res => res.json())
|
||||
.then(data => setIsProcessing(data.isProcessing));
|
||||
}, []);
|
||||
|
||||
// Listen for changes
|
||||
useEffect(() => {
|
||||
const eventSource = new EventSource('/stream');
|
||||
|
||||
eventSource.onmessage = (event) => {
|
||||
const data = JSON.parse(event.data);
|
||||
|
||||
if (data.type === 'processing_status') {
|
||||
setIsProcessing(data.isProcessing); // Simple!
|
||||
}
|
||||
};
|
||||
|
||||
return () => eventSource.close();
|
||||
}, []);
|
||||
|
||||
return { isProcessing, /* other state */ };
|
||||
}
|
||||
```
|
||||
|
||||
### TypeScript Types (Simplified)
|
||||
|
||||
```typescript
|
||||
export interface StreamEvent {
|
||||
type: 'initial_load' | 'new_observation' | 'new_summary' | 'new_prompt' | 'processing_status';
|
||||
observations?: Observation[];
|
||||
summaries?: Summary[];
|
||||
prompts?: UserPrompt[];
|
||||
projects?: string[];
|
||||
observation?: Observation;
|
||||
summary?: Summary;
|
||||
prompt?: UserPrompt;
|
||||
isProcessing?: boolean; // Simple!
|
||||
}
|
||||
```
|
||||
|
||||
### React Components (No Changes Needed!)
|
||||
|
||||
```typescript
|
||||
// App.tsx
|
||||
const { isProcessing } = useSSE(); // Already a boolean now!
|
||||
|
||||
<Header isProcessing={isProcessing} /> // Just pass it through
|
||||
|
||||
// Header.tsx (no changes needed)
|
||||
<img className={`logomark ${isProcessing ? 'spinning' : ''}`} />
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## Breaking Changes & Decisions
|
||||
|
||||
### Decision 1: What About Skeleton Cards?
|
||||
|
||||
**Current**: Skeleton cards in feed show "Generating..." for each processing session
|
||||
|
||||
**Options**:
|
||||
|
||||
**A) Keep skeleton cards** (requires session tracking)
|
||||
- Need to track individual session IDs
|
||||
- Justifies the Set<string> complexity
|
||||
- Provides per-session feedback in feed
|
||||
|
||||
**B) Remove skeleton cards** (simplest)
|
||||
- Only logomark spins (global processing indicator)
|
||||
- No need to track individual sessions
|
||||
- Simpler architecture
|
||||
|
||||
**C) Hybrid: Single skeleton card** (middle ground)
|
||||
- Show ONE skeleton card when `isProcessing === true`
|
||||
- Don't tie it to specific sessions
|
||||
- Keep it simple but provide feed feedback
|
||||
|
||||
**What do you want?**
|
||||
|
||||
---
|
||||
|
||||
### Decision 2: Multiple Concurrent Sessions?
|
||||
|
||||
**Question**: Can multiple sessions be processing simultaneously?
|
||||
|
||||
**Current assumption**: Yes (hence the Set<string>)
|
||||
|
||||
**Reality check**: Worker processes messages from a queue. Can it actually process multiple sessions at once, or is it sequential?
|
||||
|
||||
**If sequential**: We DEFINITELY don't need session tracking. One boolean is perfect.
|
||||
|
||||
**If concurrent**: We still might not need session tracking for the logomark (just spin if ANY processing), but skeleton cards would need session IDs.
|
||||
|
||||
---
|
||||
|
||||
## Recommended Implementation Plan
|
||||
|
||||
### Phase 1: Add Initial State (Quick Win)
|
||||
|
||||
**File**: `src/services/worker-service.ts`
|
||||
- Add `private isProcessing: boolean = false;`
|
||||
- Add GET `/api/processing-status` endpoint
|
||||
- Set `this.isProcessing = true` on line 817
|
||||
- Set `this.isProcessing = false` on lines 1153, 1183
|
||||
|
||||
**File**: `src/ui/viewer/hooks/useSSE.ts`
|
||||
- Add `fetch('/api/processing-status')` on mount
|
||||
- Initialize `isProcessing` state from response
|
||||
|
||||
**Impact**: Fixes the "no spinner on page load" bug without breaking changes.
|
||||
|
||||
---
|
||||
|
||||
### Phase 2: Simplify State (Breaking Change)
|
||||
|
||||
**File**: `src/services/worker-service.ts`
|
||||
- Change `broadcastProcessingStatus()` to send `{ type: 'processing_status', isProcessing: boolean }`
|
||||
- Remove session_id from broadcast
|
||||
|
||||
**File**: `src/ui/viewer/hooks/useSSE.ts`
|
||||
- Change `processingSessions` Set to `isProcessing` boolean
|
||||
- Simplify event handler: `setIsProcessing(data.isProcessing)`
|
||||
- Remove defensive cleanup from `new_summary` handler
|
||||
|
||||
**File**: `src/ui/viewer/types.ts`
|
||||
- Simplify `StreamEvent.processing` to just `isProcessing?: boolean`
|
||||
|
||||
**File**: `src/ui/viewer/App.tsx`
|
||||
- Change `processingSessions.size > 0` to just `isProcessing`
|
||||
|
||||
**File**: `src/ui/viewer/components/Feed.tsx`
|
||||
- **Decision needed**: Remove skeleton cards or show single generic skeleton?
|
||||
|
||||
**Impact**: Cleaner code, easier to maintain, fewer bugs.
|
||||
|
||||
---
|
||||
|
||||
## Files That Need Changes
|
||||
|
||||
### Worker Service
|
||||
- `src/services/worker-service.ts` (add state, endpoint, update broadcasts)
|
||||
|
||||
### React
|
||||
- `src/ui/viewer/hooks/useSSE.ts` (boolean instead of Set, fetch initial state)
|
||||
- `src/ui/viewer/types.ts` (simplify StreamEvent)
|
||||
- `src/ui/viewer/App.tsx` (pass boolean instead of Set.size > 0)
|
||||
- `src/ui/viewer/components/Feed.tsx` (handle skeleton cards decision)
|
||||
- `src/ui/viewer/constants/api.ts` (add PROCESSING_STATUS endpoint)
|
||||
|
||||
### No Changes Needed
|
||||
- `src/ui/viewer/components/Header.tsx` (already receives boolean)
|
||||
- `src/ui/viewer/components/SummarySkeleton.tsx` (might be removed)
|
||||
- CSS/animations (work the same with boolean)
|
||||
|
||||
---
|
||||
|
||||
## Summary: What's Fucking Stupid
|
||||
|
||||
1. **Set<string> when we only need boolean** ← Biggest offender
|
||||
2. **No initial state on page load** ← Broken UX
|
||||
3. **Complex Set manipulation** ← 10+ lines for add/remove
|
||||
4. **Defensive cleanup in multiple places** ← No single source of truth
|
||||
5. **Session IDs in SSE events** ← Data we don't use
|
||||
6. **Worker doesn't maintain state** ← UI is source of truth (backwards!)
|
||||
|
||||
**Complexity Score**: 7/10 stupid
|
||||
|
||||
**After refactor**: 2/10 (the remaining complexity is React/SSE boilerplate)
|
||||
|
||||
---
|
||||
|
||||
## What Do You Want To Do?
|
||||
|
||||
Tell me:
|
||||
1. **Skeleton cards**: Keep (per-session), remove entirely, or show one generic skeleton?
|
||||
2. **Breaking changes**: OK to simplify now, or do you want backwards compatibility?
|
||||
3. **Implementation**: Want me to do Phase 1 (quick fix), Phase 2 (full refactor), or both?
|
||||
@@ -0,0 +1,564 @@
|
||||
# Processing Indicator: Complete Code Reference
|
||||
|
||||
This document provides a line-by-line breakdown of every piece of code related to the processing/activity indicator (the spinning logomark in the top left corner of the viewer UI).
|
||||
|
||||
## Overview
|
||||
|
||||
The processing indicator is a visual cue that shows when the worker service is actively processing memories (observations or summaries). It consists of:
|
||||
|
||||
1. **Logomark Image**: `claude-mem-logomark.webp` in the header
|
||||
2. **Spinning Animation**: Applied via CSS class when processing is active
|
||||
3. **State Management**: Tracked via Server-Sent Events (SSE) from the worker
|
||||
4. **Processing Sessions Set**: Maintains active session IDs being processed
|
||||
|
||||
## Data Flow
|
||||
|
||||
```
|
||||
Worker Service
|
||||
└─> broadcastProcessingStatus(sessionId, isProcessing)
|
||||
└─> broadcastSSE({ type: 'processing_status', ... })
|
||||
└─> SSE Event Stream (/stream)
|
||||
└─> useSSE Hook (React)
|
||||
└─> processingSessions Set<string>
|
||||
└─> App.tsx: isProcessing={processingSessions.size > 0}
|
||||
└─> Header.tsx: className={isProcessing ? 'spinning' : ''}
|
||||
└─> CSS Animation: @keyframes spin
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## 1. TypeScript Types
|
||||
|
||||
### File: `src/ui/viewer/types.ts`
|
||||
|
||||
**Lines 45-58: StreamEvent interface with processing_status type**
|
||||
|
||||
```typescript
|
||||
export interface StreamEvent {
|
||||
type: 'initial_load' | 'new_observation' | 'new_summary' | 'new_prompt' | 'processing_status';
|
||||
observations?: Observation[];
|
||||
summaries?: Summary[];
|
||||
prompts?: UserPrompt[];
|
||||
projects?: string[];
|
||||
observation?: Observation;
|
||||
summary?: Summary;
|
||||
prompt?: UserPrompt;
|
||||
processing?: {
|
||||
session_id: string;
|
||||
is_processing: boolean;
|
||||
};
|
||||
}
|
||||
```
|
||||
|
||||
**Purpose**: Defines the structure of SSE events. The `processing_status` type includes a `processing` object that indicates whether a session is currently being processed.
|
||||
|
||||
---
|
||||
|
||||
## 2. Worker Service (Backend)
|
||||
|
||||
### File: `src/services/worker-service.ts`
|
||||
|
||||
**Lines 247-272: broadcastSSE() - Core SSE broadcasting**
|
||||
|
||||
```typescript
|
||||
/**
|
||||
* Broadcast SSE event to all connected clients
|
||||
*/
|
||||
private broadcastSSE(event: any): void {
|
||||
if (this.sseClients.size === 0) {
|
||||
return; // No clients connected, skip broadcast
|
||||
}
|
||||
|
||||
const data = `data: ${JSON.stringify(event)}\n\n`;
|
||||
const clientsToRemove: Response[] = [];
|
||||
|
||||
for (const client of this.sseClients) {
|
||||
try {
|
||||
client.write(data);
|
||||
} catch (error) {
|
||||
// Client disconnected, mark for removal
|
||||
clientsToRemove.push(client);
|
||||
}
|
||||
}
|
||||
|
||||
// Clean up disconnected clients
|
||||
for (const client of clientsToRemove) {
|
||||
this.sseClients.delete(client);
|
||||
}
|
||||
|
||||
if (clientsToRemove.length > 0) {
|
||||
logger.info('WORKER', `SSE cleaned up disconnected clients`, { count: clientsToRemove.length });
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
**Purpose**: Broadcasts SSE events to all connected UI clients. Handles disconnected clients gracefully.
|
||||
|
||||
---
|
||||
|
||||
**Lines 274-285: broadcastProcessingStatus() - Processing indicator control**
|
||||
|
||||
```typescript
|
||||
/**
|
||||
* Broadcast processing status to SSE clients
|
||||
*/
|
||||
private broadcastProcessingStatus(claudeSessionId: string, isProcessing: boolean): void {
|
||||
this.broadcastSSE({
|
||||
type: 'processing_status',
|
||||
processing: {
|
||||
session_id: claudeSessionId,
|
||||
is_processing: isProcessing
|
||||
}
|
||||
});
|
||||
}
|
||||
```
|
||||
|
||||
**Purpose**: Dedicated method for broadcasting processing status changes. Called when sessions start/stop processing.
|
||||
|
||||
---
|
||||
|
||||
**Line 817: Summarize request triggers processing start**
|
||||
|
||||
```typescript
|
||||
// Notify UI that processing is active
|
||||
this.broadcastProcessingStatus(session.claudeSessionId, true);
|
||||
```
|
||||
|
||||
**Context**: In `handleSummarize()` method - when a summary request is queued, processing starts.
|
||||
|
||||
**File location**: `src/services/worker-service.ts:817`
|
||||
|
||||
---
|
||||
|
||||
**Line 1153: Summary generation complete - processing stops**
|
||||
|
||||
```typescript
|
||||
// Notify UI that processing is complete (summary is the final step)
|
||||
this.broadcastProcessingStatus(session.claudeSessionId, false);
|
||||
```
|
||||
|
||||
**Context**: In `processSummarizeMessage()` after successfully generating and saving a summary.
|
||||
|
||||
**File location**: `src/services/worker-service.ts:1153`
|
||||
|
||||
---
|
||||
|
||||
**Line 1183: No summary generated - still mark processing complete**
|
||||
|
||||
```typescript
|
||||
// Still mark processing as complete even if no summary was generated
|
||||
this.broadcastProcessingStatus(session.claudeSessionId, false);
|
||||
```
|
||||
|
||||
**Context**: In `processSummarizeMessage()` when no summary tags are found in the AI response.
|
||||
|
||||
**File location**: `src/services/worker-service.ts:1183`
|
||||
|
||||
---
|
||||
|
||||
## 3. React Hook: SSE Connection
|
||||
|
||||
### File: `src/ui/viewer/hooks/useSSE.ts`
|
||||
|
||||
**Line 12: processingSessions state initialization**
|
||||
|
||||
```typescript
|
||||
const [processingSessions, setProcessingSessions] = useState<Set<string>>(new Set());
|
||||
```
|
||||
|
||||
**Purpose**: Maintains a Set of session IDs currently being processed. Used to determine if any processing is active.
|
||||
|
||||
---
|
||||
|
||||
**Lines 90-104: processing_status event handler**
|
||||
|
||||
```typescript
|
||||
case 'processing_status':
|
||||
if (data.processing) {
|
||||
const processing = data.processing;
|
||||
console.log('[SSE] Processing status:', processing);
|
||||
setProcessingSessions(prev => {
|
||||
const next = new Set(prev);
|
||||
if (processing.is_processing) {
|
||||
next.add(processing.session_id);
|
||||
} else {
|
||||
next.delete(processing.session_id);
|
||||
}
|
||||
return next;
|
||||
});
|
||||
}
|
||||
break;
|
||||
```
|
||||
|
||||
**Purpose**: Listens for `processing_status` SSE events and updates the processingSessions Set:
|
||||
- `is_processing: true` → Adds session ID to Set
|
||||
- `is_processing: false` → Removes session ID from Set
|
||||
|
||||
**File location**: `src/ui/viewer/hooks/useSSE.ts:90-104`
|
||||
|
||||
---
|
||||
|
||||
**Lines 73-78: Summary completion also clears processing status**
|
||||
|
||||
```typescript
|
||||
// Mark session as no longer processing (summary is the final step)
|
||||
setProcessingSessions(prev => {
|
||||
const next = new Set(prev);
|
||||
next.delete(summary.session_id);
|
||||
return next;
|
||||
});
|
||||
```
|
||||
|
||||
**Purpose**: When a `new_summary` event arrives, remove the session from processingSessions (defensive cleanup in case the processing_status event was missed).
|
||||
|
||||
**File location**: `src/ui/viewer/hooks/useSSE.ts:73-78`
|
||||
|
||||
---
|
||||
|
||||
**Line 125: Hook return value includes processingSessions**
|
||||
|
||||
```typescript
|
||||
return { observations, summaries, prompts, projects, processingSessions, isConnected };
|
||||
```
|
||||
|
||||
**Purpose**: Exposes processingSessions Set to consuming components.
|
||||
|
||||
---
|
||||
|
||||
## 4. React Component: App
|
||||
|
||||
### File: `src/ui/viewer/App.tsx`
|
||||
|
||||
**Line 20: Destructure processingSessions from useSSE**
|
||||
|
||||
```typescript
|
||||
const { observations, summaries, prompts, projects, processingSessions, isConnected } = useSSE();
|
||||
```
|
||||
|
||||
**Purpose**: Gets the processingSessions Set from the SSE hook.
|
||||
|
||||
---
|
||||
|
||||
**Line 92: Convert Set to boolean for Header component**
|
||||
|
||||
```typescript
|
||||
isProcessing={processingSessions.size > 0}
|
||||
```
|
||||
|
||||
**Purpose**: Passes `true` to Header if ANY session is being processed (Set has items), `false` otherwise.
|
||||
|
||||
**File location**: `src/ui/viewer/App.tsx:92`
|
||||
|
||||
---
|
||||
|
||||
## 5. React Component: Header
|
||||
|
||||
### File: `src/ui/viewer/components/Header.tsx`
|
||||
|
||||
**Line 12: isProcessing prop definition**
|
||||
|
||||
```typescript
|
||||
interface HeaderProps {
|
||||
isConnected: boolean;
|
||||
projects: string[];
|
||||
currentFilter: string;
|
||||
onFilterChange: (filter: string) => void;
|
||||
onSettingsToggle: () => void;
|
||||
sidebarOpen: boolean;
|
||||
isProcessing: boolean; // ← Processing indicator prop
|
||||
themePreference: ThemePreference;
|
||||
onThemeChange: (theme: ThemePreference) => void;
|
||||
}
|
||||
```
|
||||
|
||||
**Purpose**: Defines the isProcessing boolean prop for the Header component.
|
||||
|
||||
---
|
||||
|
||||
**Line 24: isProcessing destructured from props**
|
||||
|
||||
```typescript
|
||||
export function Header({
|
||||
isConnected,
|
||||
projects,
|
||||
currentFilter,
|
||||
onFilterChange,
|
||||
onSettingsToggle,
|
||||
sidebarOpen,
|
||||
isProcessing, // ← Received from App.tsx
|
||||
themePreference,
|
||||
onThemeChange
|
||||
}: HeaderProps) {
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
**Line 31: Logomark with conditional spinning class**
|
||||
|
||||
```typescript
|
||||
<img src="claude-mem-logomark.webp" alt="" className={`logomark ${isProcessing ? 'spinning' : ''}`} />
|
||||
```
|
||||
|
||||
**Purpose**: The core of the processing indicator. When `isProcessing` is `true`, adds the `spinning` CSS class to the logomark image, triggering the rotation animation.
|
||||
|
||||
**File location**: `src/ui/viewer/components/Header.tsx:31`
|
||||
|
||||
**Rendered HTML Examples**:
|
||||
- Not processing: `<img src="claude-mem-logomark.webp" alt="" className="logomark" />`
|
||||
- Processing: `<img src="claude-mem-logomark.webp" alt="" className="logomark spinning" />`
|
||||
|
||||
---
|
||||
|
||||
## 6. CSS Styling & Animation
|
||||
|
||||
### File: `plugin/ui/viewer.html` (compiled output)
|
||||
|
||||
**Lines 342-349: Logomark and spinning class styles**
|
||||
|
||||
```css
|
||||
.logomark {
|
||||
height: 32px;
|
||||
width: auto;
|
||||
}
|
||||
|
||||
.logomark.spinning {
|
||||
animation: spin 1.5s linear infinite;
|
||||
}
|
||||
```
|
||||
|
||||
**Purpose**:
|
||||
- `.logomark`: Base styles for the logo image (32px height, auto width)
|
||||
- `.logomark.spinning`: Applies the spin animation when processing is active
|
||||
- **Duration**: 1.5 seconds per rotation
|
||||
- **Timing**: Linear (constant speed)
|
||||
- **Iteration**: Infinite (continues until class is removed)
|
||||
|
||||
**File location**: `plugin/ui/viewer.html:342-349`
|
||||
|
||||
---
|
||||
|
||||
**Lines 701-705: Spin animation keyframes**
|
||||
|
||||
```css
|
||||
@keyframes spin {
|
||||
to {
|
||||
transform: rotate(360deg);
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
**Purpose**: Defines the rotation animation. Rotates the element from 0° (implicit) to 360° (full circle).
|
||||
|
||||
**File location**: `plugin/ui/viewer.html:701-705`
|
||||
|
||||
---
|
||||
|
||||
## 7. API Endpoint: Stream
|
||||
|
||||
### File: `src/ui/viewer/constants/api.ts`
|
||||
|
||||
**Line 11: SSE stream endpoint**
|
||||
|
||||
```typescript
|
||||
export const API_ENDPOINTS = {
|
||||
OBSERVATIONS: '/api/observations',
|
||||
SUMMARIES: '/api/summaries',
|
||||
PROMPTS: '/api/prompts',
|
||||
SETTINGS: '/api/settings',
|
||||
STATS: '/api/stats',
|
||||
STREAM: '/stream', // ← SSE endpoint for processing events
|
||||
} as const;
|
||||
```
|
||||
|
||||
**Purpose**: Centralized API endpoint constant. The `/stream` endpoint is used by `useSSE.ts` to establish the EventSource connection.
|
||||
|
||||
---
|
||||
|
||||
## Bonus: Feed Skeleton Processing Indicator
|
||||
|
||||
While not part of the logomark spinner, the feed also shows processing state with skeleton cards and a smaller spinner.
|
||||
|
||||
### File: `src/ui/viewer/components/Feed.tsx`
|
||||
|
||||
**Lines 66-80: Create skeleton items for processing sessions**
|
||||
|
||||
```typescript
|
||||
// Create skeleton items for sessions being processed that don't have summaries yet
|
||||
const skeletons: FeedItem[] = [];
|
||||
processingSessions.forEach(sessionId => {
|
||||
if (!sessionsWithSummaries.has(sessionId)) {
|
||||
const prompt = sessionPrompts.get(sessionId);
|
||||
skeletons.push({
|
||||
itemType: 'skeleton',
|
||||
id: sessionId,
|
||||
session_id: sessionId,
|
||||
project: prompt?.project,
|
||||
// Always use current time so skeletons appear at top of feed
|
||||
created_at_epoch: Date.now()
|
||||
});
|
||||
}
|
||||
});
|
||||
```
|
||||
|
||||
**Purpose**: Creates temporary skeleton cards for sessions currently being processed (from `processingSessions` Set).
|
||||
|
||||
---
|
||||
|
||||
**Line 104: Render SummarySkeleton component**
|
||||
|
||||
```typescript
|
||||
} else if (item.itemType === 'skeleton') {
|
||||
return <SummarySkeleton key={key} sessionId={item.session_id} project={item.project} />;
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
### File: `src/ui/viewer/components/SummarySkeleton.tsx`
|
||||
|
||||
**Lines 14-17: Processing indicator in skeleton card**
|
||||
|
||||
```typescript
|
||||
<div className="processing-indicator">
|
||||
<div className="spinner"></div>
|
||||
<span>Generating...</span>
|
||||
</div>
|
||||
```
|
||||
|
||||
**Purpose**: Shows a smaller inline spinner with "Generating..." text in skeleton summary cards.
|
||||
|
||||
---
|
||||
|
||||
### CSS for Feed Spinner
|
||||
|
||||
**Lines 682-690: Processing indicator container**
|
||||
|
||||
```css
|
||||
.processing-indicator {
|
||||
display: inline-flex;
|
||||
align-items: center;
|
||||
gap: 6px;
|
||||
color: var(--color-accent-focus);
|
||||
font-size: 11px;
|
||||
font-weight: 500;
|
||||
margin-left: auto;
|
||||
}
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
**Lines 692-700: Small spinner for skeleton cards**
|
||||
|
||||
```css
|
||||
.spinner {
|
||||
width: 12px;
|
||||
height: 12px;
|
||||
border: 2px solid var(--color-border-primary);
|
||||
border-top-color: var(--color-accent-focus);
|
||||
border-radius: 50%;
|
||||
animation: spin 0.8s linear infinite;
|
||||
}
|
||||
```
|
||||
|
||||
**Purpose**: Smaller circular spinner (12px) with faster rotation (0.8s) used in skeleton cards. Uses the same `@keyframes spin` animation.
|
||||
|
||||
---
|
||||
|
||||
**Lines 711-715: Skeleton card opacity**
|
||||
|
||||
```css
|
||||
.summary-skeleton {
|
||||
opacity: 0.7;
|
||||
}
|
||||
|
||||
.summary-skeleton .processing-indicator {
|
||||
margin-left: auto;
|
||||
}
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
**Lines 715-740: Skeleton line animations (shimmer effect)**
|
||||
|
||||
```css
|
||||
.skeleton-line {
|
||||
height: 16px;
|
||||
background: linear-gradient(90deg, var(--color-skeleton-base) 25%, var(--color-skeleton-highlight) 50%, var(--color-skeleton-base) 75%);
|
||||
background-size: 200% 100%;
|
||||
animation: shimmer 1.5s infinite;
|
||||
border-radius: 4px;
|
||||
margin-bottom: 8px;
|
||||
}
|
||||
|
||||
.skeleton-title {
|
||||
height: 20px;
|
||||
width: 80%;
|
||||
margin-bottom: 10px;
|
||||
}
|
||||
|
||||
.skeleton-subtitle {
|
||||
height: 16px;
|
||||
width: 90%;
|
||||
}
|
||||
|
||||
.skeleton-subtitle.short {
|
||||
width: 60%;
|
||||
}
|
||||
|
||||
@keyframes shimmer {
|
||||
0% {
|
||||
background-position: 200% 0;
|
||||
}
|
||||
100% {
|
||||
background-position: -200% 0;
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
**Purpose**: Creates animated placeholder lines with a shimmer effect while summary is being generated.
|
||||
|
||||
---
|
||||
|
||||
## Summary: Complete Processing Flow
|
||||
|
||||
1. **User submits prompt** → Claude Code session starts
|
||||
2. **Worker receives summarize request** → `worker-service.ts:817` calls `broadcastProcessingStatus(sessionId, true)`
|
||||
3. **SSE broadcasts** → `{ type: 'processing_status', processing: { session_id: '...', is_processing: true } }`
|
||||
4. **React receives event** → `useSSE.ts:90-104` adds sessionId to `processingSessions` Set
|
||||
5. **State flows down** → `App.tsx:92` converts Set size to boolean → `Header.tsx:31` receives `isProcessing={true}`
|
||||
6. **CSS class applied** → `className="logomark spinning"` triggers animation
|
||||
7. **Logomark spins** → CSS animation `@keyframes spin` rotates 360° every 1.5s
|
||||
8. **Feed shows skeleton** → `Feed.tsx:66-80` creates skeleton cards for processing sessions
|
||||
9. **Summary completes** → `worker-service.ts:1153` calls `broadcastProcessingStatus(sessionId, false)`
|
||||
10. **SSE broadcasts** → `{ type: 'processing_status', processing: { session_id: '...', is_processing: false } }`
|
||||
11. **React clears state** → `useSSE.ts:90-104` removes sessionId from Set
|
||||
12. **Animation stops** → `isProcessing={false}` removes `spinning` class, logomark stops rotating
|
||||
|
||||
---
|
||||
|
||||
## File Summary
|
||||
|
||||
| File | Lines | Purpose |
|
||||
|------|-------|---------|
|
||||
| `src/ui/viewer/types.ts` | 45-58 | Defines `StreamEvent` interface with `processing_status` type |
|
||||
| `src/services/worker-service.ts` | 247-285, 817, 1153, 1183 | Broadcasts processing status via SSE |
|
||||
| `src/ui/viewer/hooks/useSSE.ts` | 12, 73-78, 90-104, 125 | Manages `processingSessions` Set from SSE events |
|
||||
| `src/ui/viewer/App.tsx` | 20, 92 | Converts Set to boolean, passes to Header |
|
||||
| `src/ui/viewer/components/Header.tsx` | 12, 24, 31 | Applies `spinning` class to logomark |
|
||||
| `plugin/ui/viewer.html` (CSS) | 342-349, 701-705 | Styles logomark and defines spin animation |
|
||||
| `src/ui/viewer/components/Feed.tsx` | 66-80, 104 | Creates skeleton cards for processing sessions |
|
||||
| `src/ui/viewer/components/SummarySkeleton.tsx` | 14-17 | Renders inline spinner in skeleton cards |
|
||||
| `plugin/ui/viewer.html` (CSS) | 682-740 | Styles for skeleton cards and inline spinner |
|
||||
|
||||
---
|
||||
|
||||
## Key Design Decisions
|
||||
|
||||
1. **Set vs Boolean**: Using a `Set<string>` for `processingSessions` allows tracking multiple concurrent sessions. The UI shows spinning as long as *any* session is processing.
|
||||
|
||||
2. **Defensive Cleanup**: Both `processing_status` events AND `new_summary` events clear processing state, ensuring the spinner stops even if events arrive out of order.
|
||||
|
||||
3. **CSS-Only Animation**: No JavaScript animation loops - pure CSS transforms provide smooth, GPU-accelerated rotation with minimal performance impact.
|
||||
|
||||
4. **Dual Indicators**: Header logomark (global processing state) + skeleton cards (per-session processing state) provide both overview and detail-level feedback.
|
||||
|
||||
5. **SSE Architecture**: Server-Sent Events provide real-time updates without polling, keeping UI responsive with minimal network overhead.
|
||||
@@ -0,0 +1,303 @@
|
||||
# Worker Service Refactor Plan
|
||||
|
||||
**Date**: 2025-11-06
|
||||
**Based on**: worker-service-analysis.md
|
||||
**Branch**: cleanup/worker
|
||||
|
||||
---
|
||||
|
||||
## Decisions Made
|
||||
|
||||
### 🔥🔥🔥🔥🔥 Critical Fixes
|
||||
|
||||
#### Issue #1: Fragile PM2 String Parsing
|
||||
**Decision**: DELETE all PM2 status checking code
|
||||
- Remove lines 54-98 in worker-utils.ts (PM2 list parsing)
|
||||
- Replace with simple: health check → if unhealthy, restart → wait for health
|
||||
- PM2 restart is idempotent - handles "not started" and "started but broken"
|
||||
- Rationale: "Just ping localhost:37777" - if unhealthy, restart it
|
||||
|
||||
#### Issue #2: Silent PM2 Error Handling
|
||||
**Decision**: AUTOMATICALLY RESOLVED by Issue #1
|
||||
- Gets deleted with PM2 status checking code
|
||||
- New approach naturally fails fast on execSync
|
||||
|
||||
#### Issue #3: Session Auto-Creation Duplication
|
||||
**Decision**: EXTRACT to helper method
|
||||
- Create `private getOrCreateSession(sessionDbId): ActiveSession`
|
||||
- Remove 60+ lines of duplicated code from:
|
||||
- handleInit() (lines 663-733)
|
||||
- handleObservation() (lines 754-785)
|
||||
- handleSummarize() (lines 813-844)
|
||||
- Rationale: DRY principle
|
||||
|
||||
#### Issue #4: No "Running But Unhealthy" Handling
|
||||
**Decision**: AUTOMATICALLY RESOLVED by Issue #1
|
||||
- New approach always restarts if unhealthy
|
||||
- PM2 restart handles all cases
|
||||
|
||||
#### Issue #5: Useless getWorkerPort() Wrapper
|
||||
**Decision**: CREATE proper settings reader
|
||||
- Delete the wrapper function
|
||||
- Create settings reader that:
|
||||
1. Reads from `~/.claude-mem/settings.json`
|
||||
2. Falls back to `process.env.CLAUDE_MEM_WORKER_PORT`
|
||||
3. Falls back to `37777`
|
||||
- Rationale: UI writes to `~/.claude-mem/settings.json`, worker/hooks must read from there
|
||||
|
||||
---
|
||||
|
||||
### 🔥🔥🔥 Cleanup
|
||||
|
||||
#### Issue #6: 1500ms Debounce Too Long
|
||||
**Decision**: SKIP - not a concern
|
||||
|
||||
#### Issue #7: Magic Numbers Throughout
|
||||
**Decision**: DELETE unnecessary magic numbers, UNIFY required ones
|
||||
- Remove hardcoded defaults that aren't needed
|
||||
- Centralize remaining constants with named variables
|
||||
- Locations:
|
||||
- worker-utils.ts: timeout values (100ms, 1000ms, 10000ms)
|
||||
- worker-service.ts: Line 997 (100ms), Line 109 ('50mb'), etc.
|
||||
|
||||
#### Issue #8: Configuration Duplication
|
||||
**Decision**: AUTOMATICALLY RESOLVED by Issue #7
|
||||
- Centralizing constants solves this
|
||||
|
||||
#### Issue #9: Hardcoded Model Validation
|
||||
**Decision**: AUTOMATICALLY RESOLVED by Issue #7
|
||||
- Delete hardcoded model list
|
||||
- Let SDK handle validation
|
||||
|
||||
#### Issue #10: Hardcoded Version Fallback
|
||||
**Decision**: READ from package.json
|
||||
- Line 343: Replace `'5.0.3'` with dynamic read from package.json
|
||||
- Rationale: Why hardcode a version that gets stale?
|
||||
|
||||
#### Issue #11: Unnecessary this.port Instance Variable
|
||||
**Decision**: DELETE `this.port`
|
||||
- worker-service.ts:100 - remove instance variable
|
||||
- Replace all `this.port` uses with direct constant/settings reader
|
||||
- Used at lines 351, 738, 742
|
||||
|
||||
---
|
||||
|
||||
## Implementation Plan
|
||||
|
||||
### Phase 1: worker-utils.ts Complete Rewrite
|
||||
|
||||
**File**: `src/shared/worker-utils.ts`
|
||||
|
||||
**Changes**:
|
||||
1. Create settings reader function:
|
||||
```typescript
|
||||
function getWorkerPort(): number {
|
||||
try {
|
||||
const settingsPath = join(homedir(), '.claude-mem', 'settings.json');
|
||||
if (existsSync(settingsPath)) {
|
||||
const settings = JSON.parse(readFileSync(settingsPath, 'utf-8'));
|
||||
const port = parseInt(settings.env?.CLAUDE_MEM_WORKER_PORT, 10);
|
||||
if (!isNaN(port)) return port;
|
||||
}
|
||||
} catch {}
|
||||
return parseInt(process.env.CLAUDE_MEM_WORKER_PORT || '37777', 10);
|
||||
}
|
||||
```
|
||||
|
||||
2. Add named constants:
|
||||
```typescript
|
||||
const HEALTH_CHECK_TIMEOUT_MS = 100;
|
||||
const HEALTH_CHECK_POLL_INTERVAL_MS = 100;
|
||||
const HEALTH_CHECK_MAX_WAIT_MS = 10000;
|
||||
```
|
||||
|
||||
3. Simplify `ensureWorkerRunning()`:
|
||||
```typescript
|
||||
export async function ensureWorkerRunning(): Promise<void> {
|
||||
if (await isWorkerHealthy()) return;
|
||||
|
||||
const packageRoot = getPackageRoot();
|
||||
const pm2Path = path.join(packageRoot, "node_modules", ".bin", "pm2");
|
||||
const ecosystemPath = path.join(packageRoot, "ecosystem.config.cjs");
|
||||
|
||||
execSync(`"${pm2Path}" restart "${ecosystemPath}"`, {
|
||||
cwd: packageRoot,
|
||||
stdio: 'pipe'
|
||||
});
|
||||
|
||||
if (!await waitForWorkerHealth()) {
|
||||
throw new Error("Worker failed to become healthy after restart");
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
4. Update `isWorkerHealthy()` and `waitForWorkerHealth()` to use constants
|
||||
|
||||
**Result**: ~50 lines (vs 110 original), all bugs fixed
|
||||
|
||||
---
|
||||
|
||||
### Phase 2: worker-service.ts Cleanup
|
||||
|
||||
**File**: `src/services/worker-service.ts`
|
||||
|
||||
**Changes**:
|
||||
|
||||
1. **Read version from package.json** (line 343):
|
||||
```typescript
|
||||
import { readFileSync } from 'fs';
|
||||
import { join, dirname } from 'path';
|
||||
import { fileURLToPath } from 'url';
|
||||
|
||||
const __filename = fileURLToPath(import.meta.url);
|
||||
const __dirname = dirname(__filename);
|
||||
const packageJson = JSON.parse(readFileSync(join(__dirname, '../../package.json'), 'utf-8'));
|
||||
const VERSION = packageJson.version;
|
||||
```
|
||||
|
||||
2. **Extract getOrCreateSession() helper**:
|
||||
```typescript
|
||||
private getOrCreateSession(sessionDbId: number): ActiveSession {
|
||||
let session = this.sessions.get(sessionDbId);
|
||||
if (session) return session;
|
||||
|
||||
const db = new SessionStore();
|
||||
const dbSession = db.getSessionById(sessionDbId);
|
||||
if (!dbSession) {
|
||||
db.close();
|
||||
throw new Error(`Session ${sessionDbId} not found in database`);
|
||||
}
|
||||
|
||||
session = {
|
||||
sessionDbId,
|
||||
claudeSessionId: dbSession.claude_session_id,
|
||||
sdkSessionId: null,
|
||||
project: dbSession.project,
|
||||
userPrompt: dbSession.user_prompt,
|
||||
pendingMessages: [],
|
||||
abortController: new AbortController(),
|
||||
generatorPromise: null,
|
||||
lastPromptNumber: 0,
|
||||
startTime: Date.now()
|
||||
};
|
||||
|
||||
this.sessions.set(sessionDbId, session);
|
||||
|
||||
session.generatorPromise = this.runSDKAgent(session).catch(err => {
|
||||
logger.failure('WORKER', 'SDK agent error', { sessionId: sessionDbId }, err);
|
||||
const db = new SessionStore();
|
||||
db.markSessionFailed(sessionDbId);
|
||||
db.close();
|
||||
this.sessions.delete(sessionDbId);
|
||||
});
|
||||
|
||||
db.close();
|
||||
return session;
|
||||
}
|
||||
```
|
||||
|
||||
3. **Update handleInit(), handleObservation(), handleSummarize()**:
|
||||
Replace duplication with single line:
|
||||
```typescript
|
||||
const session = this.getOrCreateSession(sessionDbId);
|
||||
```
|
||||
|
||||
4. **Delete model validation** (lines 407+):
|
||||
Remove hardcoded validModels array and validation check
|
||||
|
||||
5. **Delete this.port instance variable** (line 100):
|
||||
- Remove `private port: number = FIXED_PORT;`
|
||||
- Replace all `this.port` references with `FIXED_PORT` or settings reader
|
||||
|
||||
6. **Add named constants** at top of file:
|
||||
```typescript
|
||||
const MESSAGE_POLL_INTERVAL_MS = 100;
|
||||
const MAX_REQUEST_SIZE = '50mb';
|
||||
```
|
||||
|
||||
7. **Use named constants** throughout (lines 109, 997, etc.)
|
||||
|
||||
---
|
||||
|
||||
### Phase 3: Update Hooks
|
||||
|
||||
**Files**:
|
||||
- `src/hooks/new-hook.ts`
|
||||
- `src/hooks/save-hook.ts`
|
||||
- `src/hooks/summary-hook.ts`
|
||||
- `src/hooks/cleanup-hook.ts`
|
||||
|
||||
**Changes**:
|
||||
1. Import settings reader from worker-utils
|
||||
2. Replace `const FIXED_PORT = parseInt(process.env.CLAUDE_MEM_WORKER_PORT || '37777', 10);`
|
||||
with call to settings reader
|
||||
3. Update cleanup-hook.ts line 74 to use settings reader as fallback
|
||||
|
||||
---
|
||||
|
||||
### Phase 4: Update user-message-hook.ts
|
||||
|
||||
**File**: `src/hooks/user-message-hook.ts`
|
||||
|
||||
**Changes**:
|
||||
- Line 53: Replace hardcoded `http://localhost:37777/` with dynamic port from settings reader
|
||||
|
||||
---
|
||||
|
||||
## Files Changed
|
||||
|
||||
1. `src/shared/worker-utils.ts` - Complete rewrite (~50 lines)
|
||||
2. `src/services/worker-service.ts` - Major cleanup (remove ~60 lines duplication, add helper)
|
||||
3. `src/hooks/new-hook.ts` - Use settings reader
|
||||
4. `src/hooks/save-hook.ts` - Use settings reader
|
||||
5. `src/hooks/summary-hook.ts` - Use settings reader
|
||||
6. `src/hooks/cleanup-hook.ts` - Use settings reader
|
||||
7. `src/hooks/user-message-hook.ts` - Dynamic port in message
|
||||
|
||||
---
|
||||
|
||||
## Testing Checklist
|
||||
|
||||
After implementation:
|
||||
|
||||
- [ ] Build: `npm run build`
|
||||
- [ ] Sync: `npm run sync-marketplace`
|
||||
- [ ] Restart worker: `npm run worker:restart`
|
||||
- [ ] Start new Claude Code session (hooks should work)
|
||||
- [ ] Change port in UI settings to 38888
|
||||
- [ ] Restart worker
|
||||
- [ ] Verify worker binds to 38888
|
||||
- [ ] Verify hooks connect to 38888
|
||||
- [ ] Verify UI connects to 38888
|
||||
- [ ] Change port back to 37777
|
||||
- [ ] Test all endpoints work
|
||||
|
||||
---
|
||||
|
||||
## Expected Outcomes
|
||||
|
||||
**Lines Removed**: ~130 lines (60 from duplication, 70 from PM2 parsing)
|
||||
**Lines Added**: ~50 lines (helper method, settings reader, constants)
|
||||
**Net Change**: -80 lines
|
||||
|
||||
**Bugs Fixed**:
|
||||
- ✅ PM2 string parsing false positives
|
||||
- ✅ Silent error handling
|
||||
- ✅ No restart when unhealthy
|
||||
- ✅ Port configuration not synchronized with UI
|
||||
|
||||
**Code Quality**:
|
||||
- ✅ DRY principle applied (no duplication)
|
||||
- ✅ YAGNI principle applied (removed ceremony)
|
||||
- ✅ Fail fast error handling
|
||||
- ✅ Named constants instead of magic numbers
|
||||
- ✅ Single source of truth for configuration
|
||||
|
||||
---
|
||||
|
||||
## Notes
|
||||
|
||||
- This plan addresses all Severity 5 and Severity 4 issues from the analysis
|
||||
- Skipped Severity 2 issues that aren't actual problems (debounce timing)
|
||||
- All "automatically resolved" issues are covered by the main fixes
|
||||
- Settings synchronization bug (port not working) is now fixed
|
||||
@@ -0,0 +1,907 @@
|
||||
# Worker Service & Worker Utils: Comprehensive YAGNI Analysis
|
||||
|
||||
**Date**: 2025-11-06
|
||||
**Files Analyzed**:
|
||||
- `src/services/worker-service.ts` (1228 lines)
|
||||
- `src/shared/worker-utils.ts` (110 lines)
|
||||
|
||||
**Overall Assessment**: 80% excellent architecture, 20% cleanup needed. Worker-service is well-structured with proper error handling priorities, but worker-utils contains critical bugs and YAGNI violations.
|
||||
|
||||
---
|
||||
|
||||
## Executive Summary
|
||||
|
||||
### What These Files Do
|
||||
|
||||
**worker-service.ts**: Long-running Express HTTP service managed by PM2. Handles AI compression of observations, session management, SSE streaming for web UI, and Chroma vector sync. This is the heart of claude-mem's async processing.
|
||||
|
||||
**worker-utils.ts**: Utilities for ensuring the worker is running. Called by hooks at session start to verify/start the PM2 worker process.
|
||||
|
||||
### Critical Findings
|
||||
|
||||
#### 🔥🔥🔥🔥🔥 SEVERITY 5 - MUST FIX IMMEDIATELY
|
||||
|
||||
1. **worker-utils.ts:75** - Fragile string parsing of PM2 output causes false positives
|
||||
2. **worker-service.ts:754-844** - 60+ lines of identical session auto-creation code duplicated 3 times
|
||||
3. **worker-utils.ts:70** - Silent error handling defers PM2 failures instead of failing fast
|
||||
|
||||
#### 🔥🔥🔥 SEVERITY 3 - FIX SOON
|
||||
|
||||
4. **worker-utils.ts:77-95** - No handling for "running but unhealthy" case
|
||||
5. **worker-utils.ts:107-109** - Useless `getWorkerPort()` wrapper function
|
||||
6. **worker-service.ts:316** - 1500ms debounce is 10x too long
|
||||
|
||||
#### 🔥🔥 SEVERITY 2 - CLEANUP WHEN CONVENIENT
|
||||
|
||||
7. Multiple magic numbers (100ms, 1000ms, 10000ms) without named constants
|
||||
8. Hardcoded default values duplicated across multiple locations
|
||||
9. Hardcoded model validation list that will become stale
|
||||
|
||||
---
|
||||
|
||||
## Complete Function Catalog
|
||||
|
||||
### worker-utils.ts Functions
|
||||
|
||||
| Function | Lines | Purpose | Status |
|
||||
|----------|-------|---------|--------|
|
||||
| `isWorkerHealthy(timeoutMs)` | 10-19 | Check /health endpoint responds | ✅ OK |
|
||||
| `waitForWorkerHealth(maxWaitMs)` | 24-36 | Poll until worker healthy | 🔥 Inefficient timeout |
|
||||
| `ensureWorkerRunning()` | 43-102 | Main orchestrator to start worker | 🔥🔥🔥🔥🔥 CRITICAL BUGS |
|
||||
| `getWorkerPort()` | 107-109 | Returns FIXED_PORT constant | 🔥🔥🔥🔥🔥 DELETE THIS |
|
||||
|
||||
### worker-service.ts Functions
|
||||
|
||||
| Function | Lines | Purpose | Status |
|
||||
|----------|-------|---------|--------|
|
||||
| `findClaudePath()` | 35-65 | Find Claude Code executable | ✅ Excellent |
|
||||
| Constructor | 107-139 | Setup Express routes | ✅ Good |
|
||||
| `start()` | 141-173 | Start HTTP server, init Chroma | ✅ Excellent prioritization |
|
||||
| `getUIDirectory()` | 178-189 | Get UI path (CJS/ESM) | ✅ Good defensive code |
|
||||
| `handleHealth()` | 194-196 | GET /health | ✅ PERFECT |
|
||||
| `handleViewerHTML()` | 201-211 | GET / | ✅ Good |
|
||||
| `handleSSEStream()` | 216-245 | GET /stream (SSE) | ✅ Good |
|
||||
| `broadcastSSE()` | 250-275 | Broadcast to clients | ✅ Excellent defensive code |
|
||||
| `broadcastProcessingStatus()` | 280-286 | Broadcast processing state | ✅ Good |
|
||||
| `checkAndStopSpinner()` | 291-318 | Debounced spinner stop | 🔥 1500ms too long |
|
||||
| `handleStats()` | 323-365 | GET /api/stats | 🔥 Hardcoded paths/version |
|
||||
| `handleGetSettings()` | 370-397 | GET /api/settings | 🔥 Duplicated defaults |
|
||||
| `handlePostSettings()` | 402-461 | POST /api/settings | 🔥 Hardcoded model list |
|
||||
| `handleGetObservations()` | 467-515 | GET /api/observations | ✅ Excellent |
|
||||
| `handleGetSummaries()` | 517-576 | GET /api/summaries | ✅ Excellent |
|
||||
| `handleGetPrompts()` | 578-631 | GET /api/prompts | ✅ Excellent |
|
||||
| `handleGetProcessingStatus()` | 637-639 | GET /api/processing-status | ✅ Good |
|
||||
| `handleInit()` | 645-744 | POST /sessions/:id/init | ✅ Good but has duplication |
|
||||
| `handleObservation()` | 750-803 | POST /sessions/:id/observations | 🔥🔥🔥🔥🔥 MASSIVE DUPLICATION |
|
||||
| `handleSummarize()` | 809-858 | POST /sessions/:id/summarize | 🔥🔥🔥🔥🔥 MASSIVE DUPLICATION |
|
||||
| `handleComplete()` | 864-873 | POST /sessions/:id/complete | ✅ PERFECT |
|
||||
| `handleStatus()` | 878-893 | GET /sessions/:id/status | ✅ Good |
|
||||
| `runSDKAgent()` | 898-963 | Run SDK agent loop | ✅ Excellent |
|
||||
| `createMessageGenerator()` | 969-1060 | Async generator for SDK | ✅ Excellent |
|
||||
| `handleAgentMessage()` | 1066-1201 | Parse and store AI response | ✅ EXCELLENT |
|
||||
| `main()` | 1205-1225 | Entry point + signals | ✅ Good |
|
||||
|
||||
---
|
||||
|
||||
## Line-by-Line Analysis
|
||||
|
||||
### worker-utils.ts
|
||||
|
||||
#### Lines 1-5: Imports and Constants
|
||||
```typescript
|
||||
const FIXED_PORT = parseInt(process.env.CLAUDE_MEM_WORKER_PORT || "37777", 10);
|
||||
```
|
||||
|
||||
**What**: Parse port from env var with fallback to 37777
|
||||
**Why**: Need to know which port to connect to
|
||||
**Critique**: ✅ Good - simple constant, no unnecessary abstraction
|
||||
|
||||
---
|
||||
|
||||
#### Lines 10-19: `isWorkerHealthy(timeoutMs = 100)`
|
||||
|
||||
```typescript
|
||||
async function isWorkerHealthy(timeoutMs: number = 100): Promise<boolean> {
|
||||
try {
|
||||
const response = await fetch(`http://127.0.0.1:${FIXED_PORT}/health`, {
|
||||
signal: AbortSignal.timeout(timeoutMs)
|
||||
});
|
||||
return response.ok;
|
||||
} catch {
|
||||
return false;
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
**What**: Checks if /health endpoint responds within timeout
|
||||
**Why**: Need to know if worker is running before trying to start it
|
||||
**Critique**:
|
||||
- Default 100ms is used once (line 45 initial check)
|
||||
- Explicit 1000ms passed at line 29 (during startup polling)
|
||||
- This inconsistency is actually INTENTIONAL: quick initial check vs. waiting for startup
|
||||
- ✅ **VERDICT**: Reasonable pattern
|
||||
|
||||
**Why the two timeouts?**
|
||||
- 100ms: "Is it already running?" (fast check, don't wait)
|
||||
- 1000ms: "Is it starting up?" (wait for initialization)
|
||||
|
||||
---
|
||||
|
||||
#### Lines 24-36: `waitForWorkerHealth(maxWaitMs = 10000)`
|
||||
|
||||
```typescript
|
||||
async function waitForWorkerHealth(maxWaitMs: number = 10000): Promise<boolean> {
|
||||
const start = Date.now();
|
||||
const checkInterval = 100; // Check every 100ms
|
||||
|
||||
while (Date.now() - start < maxWaitMs) {
|
||||
if (await isWorkerHealthy(1000)) {
|
||||
return true;
|
||||
}
|
||||
// Wait before next check
|
||||
await new Promise(resolve => setTimeout(resolve, checkInterval));
|
||||
}
|
||||
return false;
|
||||
}
|
||||
```
|
||||
|
||||
**What**: Polls health endpoint every 100ms until healthy or timeout
|
||||
**Why**: Worker takes time to start, need to wait
|
||||
**Critique**:
|
||||
|
||||
🔥 **MAGIC NUMBER #1**: Line 26 `checkInterval = 100` - no units! Is this milliseconds? Should be `CHECK_INTERVAL_MS = 100`
|
||||
|
||||
🔥 **MAGIC NUMBER #2**: Line 29 `isWorkerHealthy(1000)` - why 1000ms timeout per check?
|
||||
|
||||
🔥 **INEFFICIENCY**: Each health check has 1000ms timeout, but we check every 100ms. If the worker is down, each check waits 1000ms to timeout. We could fail faster with a 100ms timeout since we retry quickly anyway.
|
||||
|
||||
**The Math**:
|
||||
- Check interval: 100ms
|
||||
- Health timeout: 1000ms
|
||||
- If worker is down, first check fails after 1000ms, then we wait 100ms, then try again
|
||||
- Total time to detect "worker is down" on first check: 1000ms (could be 100ms)
|
||||
|
||||
**RECOMMENDED**: Use 100ms timeout for health checks since we retry every 100ms anyway:
|
||||
```typescript
|
||||
const HEALTH_CHECK_TIMEOUT_MS = 100;
|
||||
const HEALTH_CHECK_POLL_INTERVAL_MS = 100;
|
||||
const HEALTH_CHECK_MAX_WAIT_MS = 10000;
|
||||
|
||||
async function waitForWorkerHealth(): Promise<boolean> {
|
||||
const start = Date.now();
|
||||
while (Date.now() - start < HEALTH_CHECK_MAX_WAIT_MS) {
|
||||
if (await isWorkerHealthy(HEALTH_CHECK_TIMEOUT_MS)) return true;
|
||||
await new Promise(resolve => setTimeout(resolve, HEALTH_CHECK_POLL_INTERVAL_MS));
|
||||
}
|
||||
return false;
|
||||
}
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
#### Lines 43-102: `ensureWorkerRunning()` - 🔥🔥🔥🔥🔥 THE DISASTER ZONE
|
||||
|
||||
```typescript
|
||||
export async function ensureWorkerRunning(): Promise<void> {
|
||||
// First, check if worker is already healthy
|
||||
if (await isWorkerHealthy()) {
|
||||
return; // Worker is already running and responsive
|
||||
}
|
||||
|
||||
const packageRoot = getPackageRoot();
|
||||
const pm2Path = path.join(packageRoot, "node_modules", ".bin", "pm2");
|
||||
const ecosystemPath = path.join(packageRoot, "ecosystem.config.cjs");
|
||||
|
||||
// Check PM2 status to see if worker process exists
|
||||
const checkProcess = spawn(pm2Path, ["list", "--no-color"], {
|
||||
cwd: packageRoot,
|
||||
stdio: ["ignore", "pipe", "ignore"],
|
||||
});
|
||||
|
||||
let output = "";
|
||||
checkProcess.stdout?.on("data", (data) => {
|
||||
output += data.toString();
|
||||
});
|
||||
|
||||
// Wait for PM2 list to complete
|
||||
await new Promise<void>((resolve, reject) => {
|
||||
checkProcess.on("error", (error) => reject(error));
|
||||
checkProcess.on("close", (code) => {
|
||||
// PM2 list can fail, but we should still continue - just assume worker isn't running
|
||||
// This handles cases where PM2 isn't installed yet
|
||||
resolve();
|
||||
});
|
||||
});
|
||||
|
||||
// Check if 'claude-mem-worker' is in the PM2 list output and is 'online'
|
||||
const isRunning = output.includes("claude-mem-worker") && output.includes("online");
|
||||
|
||||
if (!isRunning) {
|
||||
// Start the worker
|
||||
const startProcess = spawn(pm2Path, ["start", ecosystemPath], {
|
||||
cwd: packageRoot,
|
||||
stdio: "ignore",
|
||||
});
|
||||
|
||||
// Wait for PM2 start command to complete
|
||||
await new Promise<void>((resolve, reject) => {
|
||||
startProcess.on("error", (error) => reject(error));
|
||||
startProcess.on("close", (code) => {
|
||||
if (code !== 0 && code !== null) {
|
||||
reject(new Error(`PM2 start command failed with exit code ${code}`));
|
||||
} else {
|
||||
resolve();
|
||||
}
|
||||
});
|
||||
});
|
||||
}
|
||||
|
||||
// Wait for worker to become healthy (either just started or was starting)
|
||||
const healthy = await waitForWorkerHealth(10000);
|
||||
if (!healthy) {
|
||||
throw new Error("Worker failed to become healthy after starting");
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
**What**: Ensure PM2 worker is running - check health, check PM2 status, start if needed, wait for health
|
||||
**Why**: Hooks need worker running to process observations
|
||||
|
||||
#### 🔥🔥🔥🔥🔥 CRITICAL BUG #1: Fragile String Parsing (Line 75)
|
||||
|
||||
```typescript
|
||||
const isRunning = output.includes("claude-mem-worker") && output.includes("online");
|
||||
```
|
||||
|
||||
**THE PROBLEM**: This checks if BOTH strings exist ANYWHERE in the output. This is WRONG.
|
||||
|
||||
**Counter-Example**:
|
||||
```
|
||||
PM2 Process List:
|
||||
┌─────┬────────────────────┬─────────┐
|
||||
│ id │ name │ status │
|
||||
├─────┼────────────────────┼─────────┤
|
||||
│ 0 │ claude-mem-worker │ stopped │
|
||||
│ 1 │ some-other-app │ online │
|
||||
└─────┴────────────────────┴─────────┘
|
||||
```
|
||||
|
||||
This would return `true` because output contains "claude-mem-worker" AND "online", even though the worker is STOPPED!
|
||||
|
||||
**Impact**:
|
||||
- False positive: Worker is stopped, but code thinks it's running
|
||||
- Result: Skip starting worker (line 77 `if (!isRunning)`), wait for health
|
||||
- Health check fails because worker isn't actually running
|
||||
- Entire function fails with "Worker failed to become healthy"
|
||||
- User sees cryptic error instead of "Worker is stopped, restarting..."
|
||||
|
||||
**THE FIX**: Use PM2's JSON output
|
||||
```typescript
|
||||
const result = execSync(`"${pm2Path}" jlist`, { encoding: 'utf8' });
|
||||
const processes = JSON.parse(result);
|
||||
const worker = processes.find(p => p.name === 'claude-mem-worker');
|
||||
const isRunning = worker?.pm2_env?.status === 'online';
|
||||
```
|
||||
|
||||
#### 🔥🔥🔥🔥🔥 CRITICAL BUG #2: Silent Error Handling (Lines 65-72)
|
||||
|
||||
```typescript
|
||||
await new Promise<void>((resolve, reject) => {
|
||||
checkProcess.on("error", (error) => reject(error));
|
||||
checkProcess.on("close", (code) => {
|
||||
// PM2 list can fail, but we should still continue - just assume worker isn't running
|
||||
// This handles cases where PM2 isn't installed yet
|
||||
resolve(); // ← ALWAYS RESOLVES, NEVER REJECTS
|
||||
});
|
||||
});
|
||||
```
|
||||
|
||||
**THE PROBLEM**:
|
||||
1. If PM2 isn't installed, `pm2 list` fails
|
||||
2. Line 70: ALWAYS resolves, ignoring the failure
|
||||
3. `output` is empty string
|
||||
4. Line 75: `isRunning = false` (correct by accident)
|
||||
5. Line 77-94: Try to START the worker... which will ALSO fail because PM2 isn't installed
|
||||
6. Line 85-93: THIS finally rejects with error
|
||||
|
||||
**Why This Is Terrible**:
|
||||
- Defers error detection to the start command instead of failing fast
|
||||
- Confusing error message: "PM2 start command failed" instead of "PM2 not found - run npm install"
|
||||
- User wastes time waiting for PM2 list to fail, then waiting for PM2 start to fail
|
||||
- The comment is a LIE: "we should still continue" - no, we shouldn't! If PM2 isn't installed, FAIL IMMEDIATELY.
|
||||
|
||||
**THE FIX**: Fail fast
|
||||
```typescript
|
||||
await new Promise<void>((resolve, reject) => {
|
||||
checkProcess.on("error", reject);
|
||||
checkProcess.on("close", (code) => {
|
||||
if (code !== 0 && code !== null) {
|
||||
reject(new Error(`PM2 not found - install dependencies first (npm install)`));
|
||||
}
|
||||
resolve();
|
||||
});
|
||||
});
|
||||
```
|
||||
|
||||
#### 🔥🔥🔥🔥 CRITICAL BUG #3: No Handling for "Running But Unhealthy" (Lines 77-98)
|
||||
|
||||
**THE LOGIC**:
|
||||
1. Line 45: Check if worker is healthy → NO (or we would have returned)
|
||||
2. Line 54-75: Check if PM2 says worker is running
|
||||
3. Line 77: `if (!isRunning)` → start the worker
|
||||
4. Line 98: Wait for worker to become healthy
|
||||
|
||||
**THE PROBLEM**: What if PM2 says worker IS running but our health check (line 45) failed?
|
||||
|
||||
**Answer**: We do NOTHING. We skip the `if (!isRunning)` block and jump straight to line 98, waiting for it to become healthy.
|
||||
|
||||
**Why This Is Wrong**: If the worker is started but unhealthy, it won't magically heal itself. It needs to be RESTARTED.
|
||||
|
||||
**Scenarios**:
|
||||
- Worker crashed but PM2 hasn't noticed yet → Status: "online", Health: failed → We wait forever
|
||||
- Worker is in infinite loop → Status: "online", Health: timeout → We wait forever
|
||||
- Worker port is wrong → Status: "online", Health: failed → We wait forever
|
||||
|
||||
**THE FIX**: Restart if unhealthy
|
||||
```typescript
|
||||
if (!await isWorkerHealthy()) {
|
||||
// Not healthy - restart it (PM2 restart is idempotent)
|
||||
execSync(`"${pm2Path}" restart "${ecosystemPath}"`);
|
||||
if (!await waitForWorkerHealth()) {
|
||||
throw new Error("Worker failed to become healthy after restart");
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
Or even simpler: Just always restart if health fails. PM2 handles "not started" vs "started" gracefully.
|
||||
|
||||
---
|
||||
|
||||
#### Lines 107-109: `getWorkerPort()` - 🔥🔥🔥🔥🔥 DELETE THIS
|
||||
|
||||
```typescript
|
||||
/**
|
||||
* Get the worker port number (fixed port)
|
||||
*/
|
||||
export function getWorkerPort(): number {
|
||||
return FIXED_PORT;
|
||||
}
|
||||
```
|
||||
|
||||
**What**: Returns the FIXED_PORT constant
|
||||
**Why**: ???
|
||||
**Critique**: 🔥🔥🔥🔥🔥 **TEXTBOOK YAGNI VIOLATION**
|
||||
|
||||
This is the "wrapper function for a constant" anti-pattern from CLAUDE.md.
|
||||
|
||||
**THE PROBLEM**: This function adds ZERO value. It's pure ceremony.
|
||||
|
||||
**Callers should just**:
|
||||
```typescript
|
||||
import { FIXED_PORT } from './worker-utils.js';
|
||||
// Use FIXED_PORT directly
|
||||
```
|
||||
|
||||
**Instead of**:
|
||||
```typescript
|
||||
import { getWorkerPort } from './worker-utils.js';
|
||||
const port = getWorkerPort(); // Why???
|
||||
```
|
||||
|
||||
**Why This Exists**: Training bias. Code that looks "professional" often includes ceremonial getters for constants. But this is WRONG. Delete it and export the constant.
|
||||
|
||||
**THE FIX**:
|
||||
```typescript
|
||||
export const WORKER_PORT = parseInt(process.env.CLAUDE_MEM_WORKER_PORT || "37777", 10);
|
||||
```
|
||||
|
||||
Then update all callers to use `WORKER_PORT` instead of `getWorkerPort()`.
|
||||
|
||||
---
|
||||
|
||||
### worker-utils.ts COMPLETE REWRITE
|
||||
|
||||
Here's what this file SHOULD be:
|
||||
|
||||
```typescript
|
||||
import path from "path";
|
||||
import { execSync } from "child_process";
|
||||
import { getPackageRoot } from "./paths.js";
|
||||
|
||||
// Configuration
|
||||
export const WORKER_PORT = parseInt(process.env.CLAUDE_MEM_WORKER_PORT || "37777", 10);
|
||||
|
||||
const HEALTH_CHECK_TIMEOUT_MS = 100;
|
||||
const HEALTH_CHECK_POLL_INTERVAL_MS = 100;
|
||||
const HEALTH_CHECK_MAX_WAIT_MS = 10000;
|
||||
|
||||
/**
|
||||
* Check if worker is responsive by trying the health endpoint
|
||||
*/
|
||||
async function isWorkerHealthy(): Promise<boolean> {
|
||||
try {
|
||||
const response = await fetch(`http://127.0.0.1:${WORKER_PORT}/health`, {
|
||||
signal: AbortSignal.timeout(HEALTH_CHECK_TIMEOUT_MS)
|
||||
});
|
||||
return response.ok;
|
||||
} catch {
|
||||
return false;
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Wait for worker to become healthy, polling every 100ms
|
||||
*/
|
||||
async function waitForWorkerHealth(): Promise<boolean> {
|
||||
const start = Date.now();
|
||||
while (Date.now() - start < HEALTH_CHECK_MAX_WAIT_MS) {
|
||||
if (await isWorkerHealthy()) return true;
|
||||
await new Promise(resolve => setTimeout(resolve, HEALTH_CHECK_POLL_INTERVAL_MS));
|
||||
}
|
||||
return false;
|
||||
}
|
||||
|
||||
/**
|
||||
* Ensure worker service is running and healthy
|
||||
* Restarts worker if not healthy (PM2 restart is idempotent)
|
||||
*/
|
||||
export async function ensureWorkerRunning(): Promise<void> {
|
||||
if (await isWorkerHealthy()) return;
|
||||
|
||||
const packageRoot = getPackageRoot();
|
||||
const pm2Path = path.join(packageRoot, "node_modules", ".bin", "pm2");
|
||||
const ecosystemPath = path.join(packageRoot, "ecosystem.config.cjs");
|
||||
|
||||
// PM2 restart is idempotent - handles both "not started" and "started but broken"
|
||||
try {
|
||||
const result = execSync(`"${pm2Path}" restart "${ecosystemPath}"`, {
|
||||
cwd: packageRoot,
|
||||
encoding: 'utf8',
|
||||
stdio: 'pipe'
|
||||
});
|
||||
|
||||
if (!await waitForWorkerHealth()) {
|
||||
throw new Error(`Worker failed to become healthy. PM2 output:\n${result}`);
|
||||
}
|
||||
} catch (error: any) {
|
||||
if (error.code === 'ENOENT' || error.message.includes('not found')) {
|
||||
throw new Error('PM2 not found - run: npm install');
|
||||
}
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
**Line Count**: 43 lines (vs 110 original)
|
||||
**Complexity**: 1/3 of original
|
||||
**Bugs Fixed**: All of them
|
||||
**Ceremony Removed**: All of it
|
||||
|
||||
**What Changed**:
|
||||
1. Removed `getWorkerPort()` wrapper - export constant directly
|
||||
2. Removed PM2 status checking - just restart if unhealthy
|
||||
3. Removed string parsing - use PM2's idempotent restart
|
||||
4. Removed silent error handling - fail fast on PM2 not found
|
||||
5. Named all magic numbers as constants
|
||||
6. Simplified to: "Unhealthy? Restart. Wait for health. Done."
|
||||
|
||||
---
|
||||
|
||||
## worker-service.ts Analysis
|
||||
|
||||
### Overall Structure
|
||||
|
||||
**Lines 1-24**: Imports and constants ✅
|
||||
**Lines 27-65**: `findClaudePath()` ✅ Excellent
|
||||
**Lines 67-96**: Type definitions ✅
|
||||
**Lines 98-1228**: WorkerService class
|
||||
|
||||
### Critical Issues in worker-service.ts
|
||||
|
||||
#### 🔥🔥🔥🔥🔥 ISSUE #1: Massive Code Duplication (Lines 754-844)
|
||||
|
||||
**THE PROBLEM**: Session auto-creation logic is COPIED THREE TIMES:
|
||||
1. `handleInit()` (lines 663-733)
|
||||
2. `handleObservation()` (lines 754-785)
|
||||
3. `handleSummarize()` (lines 813-844)
|
||||
|
||||
**The Duplicated Code** (20+ lines per copy):
|
||||
```typescript
|
||||
let session = this.sessions.get(sessionDbId);
|
||||
if (!session) {
|
||||
const db = new SessionStore();
|
||||
const dbSession = db.getSessionById(sessionDbId);
|
||||
db.close();
|
||||
|
||||
session = {
|
||||
sessionDbId,
|
||||
claudeSessionId: dbSession!.claude_session_id,
|
||||
sdkSessionId: null,
|
||||
project: dbSession!.project,
|
||||
userPrompt: dbSession!.user_prompt,
|
||||
pendingMessages: [],
|
||||
abortController: new AbortController(),
|
||||
generatorPromise: null,
|
||||
lastPromptNumber: 0,
|
||||
startTime: Date.now()
|
||||
};
|
||||
this.sessions.set(sessionDbId, session);
|
||||
|
||||
session.generatorPromise = this.runSDKAgent(session).catch(err => {
|
||||
logger.failure('WORKER', 'SDK agent error', { sessionId: sessionDbId }, err);
|
||||
const db = new SessionStore();
|
||||
db.markSessionFailed(sessionDbId);
|
||||
db.close();
|
||||
this.sessions.delete(sessionDbId);
|
||||
});
|
||||
}
|
||||
```
|
||||
|
||||
**Impact**: 60+ lines of duplicated code across 3 functions
|
||||
|
||||
**THE FIX**: Extract to helper method
|
||||
```typescript
|
||||
private getOrCreateSession(sessionDbId: number): ActiveSession {
|
||||
let session = this.sessions.get(sessionDbId);
|
||||
if (session) return session;
|
||||
|
||||
const db = new SessionStore();
|
||||
const dbSession = db.getSessionById(sessionDbId);
|
||||
if (!dbSession) {
|
||||
db.close();
|
||||
throw new Error(`Session ${sessionDbId} not found in database`);
|
||||
}
|
||||
|
||||
session = {
|
||||
sessionDbId,
|
||||
claudeSessionId: dbSession.claude_session_id,
|
||||
sdkSessionId: null,
|
||||
project: dbSession.project,
|
||||
userPrompt: dbSession.user_prompt,
|
||||
pendingMessages: [],
|
||||
abortController: new AbortController(),
|
||||
generatorPromise: null,
|
||||
lastPromptNumber: 0,
|
||||
startTime: Date.now()
|
||||
};
|
||||
|
||||
this.sessions.set(sessionDbId, session);
|
||||
|
||||
// Start SDK agent in background
|
||||
session.generatorPromise = this.runSDKAgent(session).catch(err => {
|
||||
logger.failure('WORKER', 'SDK agent error', { sessionId: sessionDbId }, err);
|
||||
const db = new SessionStore();
|
||||
db.markSessionFailed(sessionDbId);
|
||||
db.close();
|
||||
this.sessions.delete(sessionDbId);
|
||||
});
|
||||
|
||||
db.close();
|
||||
return session;
|
||||
}
|
||||
```
|
||||
|
||||
Then all three functions become:
|
||||
```typescript
|
||||
private handleObservation(req: Request, res: Response): void {
|
||||
const sessionDbId = parseInt(req.params.sessionDbId, 10);
|
||||
const { tool_name, tool_input, tool_output, prompt_number } = req.body;
|
||||
|
||||
const session = this.getOrCreateSession(sessionDbId);
|
||||
|
||||
session.pendingMessages.push({
|
||||
type: 'observation',
|
||||
tool_name,
|
||||
tool_input,
|
||||
tool_output,
|
||||
prompt_number
|
||||
});
|
||||
|
||||
res.json({ status: 'queued', queueLength: session.pendingMessages.length });
|
||||
}
|
||||
```
|
||||
|
||||
**Savings**: Remove 60 lines, improve maintainability 10x
|
||||
|
||||
---
|
||||
|
||||
#### 🔥🔥 ISSUE #2: Magic Numbers Throughout
|
||||
|
||||
**Line 316**: `setTimeout(() => { ... }, 1500);` - Why 1500ms debounce?
|
||||
**Line 997**: `setTimeout(resolve, 100)` - Why 100ms polling?
|
||||
**Line 343**: `const version = process.env.npm_package_version || '5.0.3';` - Hardcoded fallback
|
||||
**Line 109**: `express.json({ limit: '50mb' })` - Why 50mb?
|
||||
|
||||
**THE FIX**: Named constants
|
||||
```typescript
|
||||
const SPINNER_DEBOUNCE_MS = 200; // Debounce spinner to prevent flicker
|
||||
const MESSAGE_POLL_INTERVAL_MS = 100; // Check for new messages every 100ms
|
||||
const MAX_REQUEST_SIZE = '50mb'; // Allow large tool outputs
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
#### 🔥🔥 ISSUE #3: Configuration Duplication
|
||||
|
||||
Default values appear in multiple places:
|
||||
- Line 377-380: Default settings in GET handler
|
||||
- Line 22: MODEL default
|
||||
- Throughout: Port defaults, observation count defaults
|
||||
|
||||
**THE FIX**: Centralize
|
||||
```typescript
|
||||
export const DEFAULT_CONFIG = {
|
||||
MODEL: 'claude-haiku-4-5',
|
||||
CONTEXT_OBSERVATIONS: 50,
|
||||
WORKER_PORT: 37777,
|
||||
VALID_MODELS: ['claude-haiku-4-5', 'claude-sonnet-4-5', 'claude-opus-4'],
|
||||
MAX_CONTEXT_OBSERVATIONS: 200,
|
||||
MIN_PORT: 1024,
|
||||
MAX_PORT: 65535
|
||||
} as const;
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
#### 🔥 ISSUE #4: Hardcoded Model Validation (Line 407)
|
||||
|
||||
```typescript
|
||||
const validModels = ['claude-haiku-4-5', 'claude-sonnet-4-5', 'claude-opus-4'];
|
||||
```
|
||||
|
||||
**THE PROBLEM**: This list will get stale when new models are released.
|
||||
|
||||
**YAGNI QUESTION**: Do we even need to validate? The SDK will error if model doesn't exist.
|
||||
|
||||
**ANSWER**: Better error messages for users. But this should be a WARNING, not a blocker.
|
||||
|
||||
**THE FIX**: Remove validation or make it advisory
|
||||
```typescript
|
||||
// Let SDK handle validation - it knows the current model list
|
||||
// We don't need to duplicate that logic here
|
||||
if (CLAUDE_MEM_MODEL) {
|
||||
settings.env.CLAUDE_MEM_MODEL = CLAUDE_MEM_MODEL;
|
||||
logger.info('WORKER', `Model changed to ${CLAUDE_MEM_MODEL}`, {});
|
||||
}
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
### What worker-service.ts Does RIGHT ✅
|
||||
|
||||
#### 1. Excellent Error Handling Priority
|
||||
```typescript
|
||||
// Store to SQLite FIRST (source of truth)
|
||||
const { id, createdAtEpoch } = db.storeObservation(...);
|
||||
|
||||
// Broadcast to SSE (real-time UI updates)
|
||||
this.broadcastSSE({ type: 'new_observation', ... });
|
||||
|
||||
// Sync to Chroma ASYNC (fire-and-forget, non-critical)
|
||||
this.chromaSync.syncObservation(...)
|
||||
.catch((error: Error) => {
|
||||
logger.error('...continuing', ...);
|
||||
// Don't crash - SQLite has the data
|
||||
});
|
||||
```
|
||||
|
||||
**Priority**: SQLite > SSE > Chroma
|
||||
**Philosophy**: Write to source of truth first, update UI second, sync to vector DB last. Chroma failures don't crash the worker.
|
||||
|
||||
#### 2. Clean Pagination APIs
|
||||
|
||||
All data endpoints follow consistent pattern:
|
||||
- Parse `offset`, `limit`, `project` from query params
|
||||
- Cap limit at 100 to prevent abuse
|
||||
- Return `{ items, hasMore, total, offset, limit }`
|
||||
- Use parameterized queries (SQL injection safe)
|
||||
|
||||
Example: `handleGetObservations()` (lines 467-515) is textbook good API design.
|
||||
|
||||
#### 3. Proper Async Generator Pattern
|
||||
|
||||
`createMessageGenerator()` (lines 969-1060) is an excellent implementation:
|
||||
- Yields init prompt immediately
|
||||
- Polls message queue with proper abort signal handling
|
||||
- No busy-waiting (100ms sleep between polls)
|
||||
- Clean message type discrimination
|
||||
- Proper error propagation
|
||||
|
||||
#### 4. Defensive SSE Cleanup
|
||||
|
||||
`broadcastSSE()` (lines 250-275):
|
||||
- Early return if no clients (optimization)
|
||||
- Two-phase cleanup (collect failures, then remove)
|
||||
- Doesn't modify Set during iteration
|
||||
- Handles disconnected clients gracefully
|
||||
|
||||
This is GOOD defensive programming, not YAGNI violation.
|
||||
|
||||
---
|
||||
|
||||
## Severity-Ranked YAGNI Violations
|
||||
|
||||
### 🔥🔥🔥🔥🔥 SEVERITY 5: CRITICAL - FIX IMMEDIATELY
|
||||
|
||||
| Issue | File | Lines | Problem | Impact |
|
||||
|-------|------|-------|---------|--------|
|
||||
| Fragile string parsing | worker-utils | 75 | `output.includes("claude-mem-worker") && output.includes("online")` | False positives cause failures |
|
||||
| Session auto-creation duplication | worker-service | 754-844 | 60+ lines copied 3 times | Maintenance nightmare |
|
||||
| Silent PM2 error handling | worker-utils | 70 | Always resolves, defers errors | Confusing error messages |
|
||||
|
||||
### 🔥🔥🔥🔥 SEVERITY 4: MAJOR - FIX SOON
|
||||
|
||||
| Issue | File | Lines | Problem | Impact |
|
||||
|-------|------|-------|---------|--------|
|
||||
| No "running but unhealthy" handling | worker-utils | 77-98 | Skip restart if PM2 says running | Worker never recovers |
|
||||
| Useless getWorkerPort() wrapper | worker-utils | 107-109 | Ceremony for a constant | Code bloat |
|
||||
|
||||
### 🔥🔥🔥 SEVERITY 3: MODERATE - FIX WHEN CONVENIENT
|
||||
|
||||
| Issue | File | Lines | Problem | Impact |
|
||||
|-------|------|-------|---------|--------|
|
||||
| 1500ms debounce too long | worker-service | 316 | Should be 100-200ms | Spinner lags |
|
||||
| Hardcoded model validation | worker-service | 407 | List will get stale | Blocks valid models |
|
||||
| Hardcoded fallback version | worker-service | 343 | '5.0.3' will get stale | Wrong stats |
|
||||
|
||||
### 🔥🔥 SEVERITY 2: MINOR - CLEANUP
|
||||
|
||||
| Issue | File | Lines | Problem | Impact |
|
||||
|-------|------|-------|---------|--------|
|
||||
| Magic numbers everywhere | Both | Multiple | 100, 1000, 1500, etc | Hard to maintain |
|
||||
| Duplicated default configs | worker-service | Multiple | Defaults in many places | Inconsistency risk |
|
||||
| Unnecessary this.port | worker-service | 100 | Should use FIXED_PORT | Confusion |
|
||||
|
||||
---
|
||||
|
||||
## Recommended Action Plan
|
||||
|
||||
### Phase 1: Critical Fixes (Do Today)
|
||||
|
||||
1. **Fix worker-utils.ts completely** - Use the rewrite provided above (43 lines)
|
||||
- Remove getWorkerPort()
|
||||
- Fix PM2 string parsing → use `pm2 restart` (idempotent)
|
||||
- Remove silent error handling
|
||||
- Named constants for all timeouts
|
||||
|
||||
2. **Extract getOrCreateSession()** in worker-service.ts
|
||||
- Remove 60 lines of duplication
|
||||
- Update handleInit, handleObservation, handleSummarize
|
||||
|
||||
### Phase 2: Cleanup (Do This Week)
|
||||
|
||||
3. **Centralize configuration**
|
||||
- Create DEFAULT_CONFIG constant
|
||||
- Remove duplicated defaults
|
||||
- Update all references
|
||||
|
||||
4. **Fix magic numbers**
|
||||
- SPINNER_DEBOUNCE_MS = 200
|
||||
- MESSAGE_POLL_INTERVAL_MS = 100
|
||||
- HEALTH_CHECK_TIMEOUT_MS = 100
|
||||
- etc.
|
||||
|
||||
5. **Remove hardcoded validations**
|
||||
- Model validation (let SDK handle it)
|
||||
- Fallback version (read from package.json)
|
||||
|
||||
### Phase 3: Polish (Do Next Week)
|
||||
|
||||
6. **Fix minor issues**
|
||||
- Remove `this.port` instance variable
|
||||
- Update debounce to 200ms
|
||||
- Add constants for all magic numbers
|
||||
|
||||
---
|
||||
|
||||
## The YAGNI Philosophy Applied
|
||||
|
||||
### What YAGNI Means Here
|
||||
|
||||
**You Aren't Gonna Need It**: Don't build infrastructure for problems you don't have.
|
||||
|
||||
### Examples from This Code
|
||||
|
||||
#### YAGNI Violation ❌
|
||||
```typescript
|
||||
export function getWorkerPort(): number {
|
||||
return FIXED_PORT; // Wrapper for a constant
|
||||
}
|
||||
```
|
||||
**Why**: Adds zero value. Pure ceremony. Just export the constant.
|
||||
|
||||
#### YAGNI Compliance ✅
|
||||
```typescript
|
||||
export const WORKER_PORT = parseInt(...);
|
||||
```
|
||||
**Why**: Solves the actual need (get port) without ceremony.
|
||||
|
||||
---
|
||||
|
||||
#### YAGNI Violation ❌
|
||||
```typescript
|
||||
// Check PM2 status with string parsing
|
||||
const checkProcess = spawn(pm2Path, ["list", "--no-color"]);
|
||||
let output = "";
|
||||
checkProcess.stdout?.on("data", (data) => { output += data.toString(); });
|
||||
// ... 30 lines of promise wrappers and parsing ...
|
||||
const isRunning = output.includes("claude-mem-worker") && output.includes("online");
|
||||
|
||||
if (!isRunning) {
|
||||
// Start worker
|
||||
}
|
||||
// But what if it's running AND unhealthy? Do nothing!
|
||||
```
|
||||
**Why**: Solving a problem that doesn't exist. PM2 restart is idempotent - it handles both "not started" and "started but broken". We don't need to distinguish.
|
||||
|
||||
#### YAGNI Compliance ✅
|
||||
```typescript
|
||||
if (!await isWorkerHealthy()) {
|
||||
execSync(`pm2 restart ecosystem.config.cjs`);
|
||||
await waitForWorkerHealth();
|
||||
}
|
||||
```
|
||||
**Why**: Solves the actual problem (ensure worker is healthy) in the simplest way.
|
||||
|
||||
---
|
||||
|
||||
### The Pattern
|
||||
|
||||
**YAGNI Violations Follow This Pattern**:
|
||||
1. Imagine a scenario ("what if PM2 isn't installed?")
|
||||
2. Write defensive code for the scenario (silent error handling)
|
||||
3. Defer the error to a later point
|
||||
4. Make the actual error message worse
|
||||
|
||||
**YAGNI Compliance Follows This Pattern**:
|
||||
1. Write the obvious solution (check health, restart if unhealthy)
|
||||
2. Let errors propagate naturally
|
||||
3. Add error handling only where actually needed
|
||||
4. Keep error messages clear and direct
|
||||
|
||||
---
|
||||
|
||||
## Conclusion
|
||||
|
||||
### Overall Assessment
|
||||
|
||||
**worker-utils.ts**: 🔥🔥🔥🔥 2/5 - Needs complete rewrite
|
||||
**worker-service.ts**: ✅✅✅✅🔥 4/5 - Mostly excellent, fix duplication
|
||||
|
||||
### The Good
|
||||
|
||||
- worker-service.ts has excellent architecture (SQLite > SSE > Chroma priority)
|
||||
- Clean pagination APIs with proper parameterization
|
||||
- Good async generator pattern for SDK streaming
|
||||
- Proper SSE client management with defensive cleanup
|
||||
- Non-blocking Chroma sync with graceful failures
|
||||
|
||||
### The Bad
|
||||
|
||||
- worker-utils.ts has 3 critical bugs (string parsing, silent errors, missing restart)
|
||||
- 60+ lines of duplicated session auto-creation code
|
||||
- Magic numbers everywhere without named constants
|
||||
- Hardcoded defaults in multiple locations
|
||||
|
||||
### The Ugly
|
||||
|
||||
- `getWorkerPort()` is pure ceremony - delete it
|
||||
- 1500ms debounce is 10x too long
|
||||
- PM2 string parsing is fragile and will break
|
||||
- Silent error handling makes debugging impossible
|
||||
|
||||
### Time to Fix
|
||||
|
||||
- Critical fixes (worker-utils rewrite + extract getOrCreateSession): **2 hours**
|
||||
- Cleanup (centralize config, fix magic numbers): **2 hours**
|
||||
- Polish (minor issues): **1 hour**
|
||||
|
||||
**Total**: 5 hours to bring codebase from 80% to 95% quality.
|
||||
|
||||
### Final Verdict
|
||||
|
||||
This code is **80% excellent, 20% disaster**. The disaster is concentrated in worker-utils.ts (which is called on EVERY session start) and the session auto-creation duplication (which makes maintenance painful). Fix these two issues and you have a rock-solid codebase.
|
||||
|
||||
The worker-service.ts architecture is actually brilliant - the prioritization of SQLite > SSE > Chroma is exactly right, and the async generator pattern for SDK streaming is textbook perfect. Don't let the duplication overshadow the good design.
|
||||
|
||||
**Recommendation**: Fix worker-utils.ts TODAY (it has production bugs), extract getOrCreateSession() THIS WEEK (it's painful to maintain), and clean up the rest NEXT WEEK.
|
||||
@@ -1,7 +1,7 @@
|
||||
#!/usr/bin/env node
|
||||
import{stdin as I}from"process";import M from"better-sqlite3";import{join as E,dirname as y,basename as F}from"path";import{homedir as O}from"os";import{existsSync as H,mkdirSync as k}from"fs";import{fileURLToPath as x}from"url";function U(){return typeof __dirname<"u"?__dirname:y(x(import.meta.url))}var P=U(),u=process.env.CLAUDE_MEM_DATA_DIR||E(O(),".claude-mem"),R=process.env.CLAUDE_CONFIG_DIR||E(O(),".claude"),W=E(u,"archives"),Y=E(u,"logs"),K=E(u,"trash"),V=E(u,"backups"),q=E(u,"settings.json"),f=E(u,"claude-mem.db"),J=E(u,"vector-db"),Q=E(R,"settings.json"),z=E(R,"commands"),Z=E(R,"CLAUDE.md");function L(c){k(c,{recursive:!0})}var h=(n=>(n[n.DEBUG=0]="DEBUG",n[n.INFO=1]="INFO",n[n.WARN=2]="WARN",n[n.ERROR=3]="ERROR",n[n.SILENT=4]="SILENT",n))(h||{}),N=class{level;useColor;constructor(){let e=process.env.CLAUDE_MEM_LOG_LEVEL?.toUpperCase()||"INFO";this.level=h[e]??1,this.useColor=process.stdout.isTTY??!1}correlationId(e,s){return`obs-${e}-${s}`}sessionId(e){return`session-${e}`}formatData(e){if(e==null)return"";if(typeof e=="string")return e;if(typeof e=="number"||typeof e=="boolean")return e.toString();if(typeof e=="object"){if(e instanceof Error)return this.level===0?`${e.message}
|
||||
${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Object.keys(e);return s.length===0?"{}":s.length<=3?JSON.stringify(e):`{${s.length} keys: ${s.slice(0,3).join(", ")}...}`}return String(e)}formatTool(e,s){if(!s)return e;try{let t=typeof s=="string"?JSON.parse(s):s;if(e==="Bash"&&t.command){let r=t.command.length>50?t.command.substring(0,50)+"...":t.command;return`${e}(${r})`}if(e==="Read"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Edit"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Write"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}return e}catch{return e}}log(e,s,t,r,n){if(e<this.level)return;let o=new Date().toISOString().replace("T"," ").substring(0,23),i=h[e].padEnd(5),d=s.padEnd(6),_="";r?.correlationId?_=`[${r.correlationId}] `:r?.sessionId&&(_=`[session-${r.sessionId}] `);let l="";n!=null&&(this.level===0&&typeof n=="object"?l=`
|
||||
`+JSON.stringify(n,null,2):l=" "+this.formatData(n));let T="";if(r){let{sessionId:m,sdkSessionId:b,correlationId:p,...a}=r;Object.keys(a).length>0&&(T=` {${Object.entries(a).map(([v,D])=>`${v}=${D}`).join(", ")}}`)}let S=`[${o}] [${i}] [${d}] ${_}${t}${T}${l}`;e===3?console.error(S):console.log(S)}debug(e,s,t,r){this.log(0,e,s,t,r)}info(e,s,t,r){this.log(1,e,s,t,r)}warn(e,s,t,r){this.log(2,e,s,t,r)}error(e,s,t,r){this.log(3,e,s,t,r)}dataIn(e,s,t,r){this.info(e,`\u2192 ${s}`,t,r)}dataOut(e,s,t,r){this.info(e,`\u2190 ${s}`,t,r)}success(e,s,t,r){this.info(e,`\u2713 ${s}`,t,r)}failure(e,s,t,r){this.error(e,`\u2717 ${s}`,t,r)}timing(e,s,t,r){this.info(e,`\u23F1 ${s}`,r,{duration:`${t}ms`})}},A=new N;var g=class{db;constructor(){L(u),this.db=new M(f),this.db.pragma("journal_mode = WAL"),this.db.pragma("synchronous = NORMAL"),this.db.pragma("foreign_keys = ON"),this.initializeSchema(),this.ensureWorkerPortColumn(),this.ensurePromptTrackingColumns(),this.removeSessionSummariesUniqueConstraint(),this.addObservationHierarchicalFields(),this.makeObservationsTextNullable(),this.createUserPromptsTable()}initializeSchema(){try{this.db.exec(`
|
||||
import{stdin as I}from"process";import w from"better-sqlite3";import{join as E,dirname as k,basename as W}from"path";import{homedir as O}from"os";import{existsSync as K,mkdirSync as x}from"fs";import{fileURLToPath as U}from"url";function M(){return typeof __dirname<"u"?__dirname:k(U(import.meta.url))}var q=M(),u=process.env.CLAUDE_MEM_DATA_DIR||E(O(),".claude-mem"),R=process.env.CLAUDE_CONFIG_DIR||E(O(),".claude"),J=E(u,"archives"),Q=E(u,"logs"),z=E(u,"trash"),Z=E(u,"backups"),ee=E(u,"settings.json"),f=E(u,"claude-mem.db"),se=E(u,"vector-db"),te=E(R,"settings.json"),re=E(R,"commands"),ne=E(R,"CLAUDE.md");function L(c){x(c,{recursive:!0})}var h=(n=>(n[n.DEBUG=0]="DEBUG",n[n.INFO=1]="INFO",n[n.WARN=2]="WARN",n[n.ERROR=3]="ERROR",n[n.SILENT=4]="SILENT",n))(h||{}),N=class{level;useColor;constructor(){let e=process.env.CLAUDE_MEM_LOG_LEVEL?.toUpperCase()||"INFO";this.level=h[e]??1,this.useColor=process.stdout.isTTY??!1}correlationId(e,s){return`obs-${e}-${s}`}sessionId(e){return`session-${e}`}formatData(e){if(e==null)return"";if(typeof e=="string")return e;if(typeof e=="number"||typeof e=="boolean")return e.toString();if(typeof e=="object"){if(e instanceof Error)return this.level===0?`${e.message}
|
||||
${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Object.keys(e);return s.length===0?"{}":s.length<=3?JSON.stringify(e):`{${s.length} keys: ${s.slice(0,3).join(", ")}...}`}return String(e)}formatTool(e,s){if(!s)return e;try{let t=typeof s=="string"?JSON.parse(s):s;if(e==="Bash"&&t.command){let r=t.command.length>50?t.command.substring(0,50)+"...":t.command;return`${e}(${r})`}if(e==="Read"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Edit"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Write"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}return e}catch{return e}}log(e,s,t,r,n){if(e<this.level)return;let o=new Date().toISOString().replace("T"," ").substring(0,23),i=h[e].padEnd(5),d=s.padEnd(6),_="";r?.correlationId?_=`[${r.correlationId}] `:r?.sessionId&&(_=`[session-${r.sessionId}] `);let m="";n!=null&&(this.level===0&&typeof n=="object"?m=`
|
||||
`+JSON.stringify(n,null,2):m=" "+this.formatData(n));let T="";if(r){let{sessionId:l,sdkSessionId:S,correlationId:p,...a}=r;Object.keys(a).length>0&&(T=` {${Object.entries(a).map(([y,D])=>`${y}=${D}`).join(", ")}}`)}let b=`[${o}] [${i}] [${d}] ${_}${t}${T}${m}`;e===3?console.error(b):console.log(b)}debug(e,s,t,r){this.log(0,e,s,t,r)}info(e,s,t,r){this.log(1,e,s,t,r)}warn(e,s,t,r){this.log(2,e,s,t,r)}error(e,s,t,r){this.log(3,e,s,t,r)}dataIn(e,s,t,r){this.info(e,`\u2192 ${s}`,t,r)}dataOut(e,s,t,r){this.info(e,`\u2190 ${s}`,t,r)}success(e,s,t,r){this.info(e,`\u2713 ${s}`,t,r)}failure(e,s,t,r){this.error(e,`\u2717 ${s}`,t,r)}timing(e,s,t,r){this.info(e,`\u23F1 ${s}`,r,{duration:`${t}ms`})}},A=new N;var g=class{db;constructor(){L(u),this.db=new w(f),this.db.pragma("journal_mode = WAL"),this.db.pragma("synchronous = NORMAL"),this.db.pragma("foreign_keys = ON"),this.initializeSchema(),this.ensureWorkerPortColumn(),this.ensurePromptTrackingColumns(),this.removeSessionSummariesUniqueConstraint(),this.addObservationHierarchicalFields(),this.makeObservationsTextNullable(),this.createUserPromptsTable()}initializeSchema(){try{this.db.exec(`
|
||||
CREATE TABLE IF NOT EXISTS schema_versions (
|
||||
id INTEGER PRIMARY KEY,
|
||||
version INTEGER UNIQUE NOT NULL,
|
||||
@@ -317,23 +317,23 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
|
||||
INSERT INTO sdk_sessions
|
||||
(claude_session_id, sdk_session_id, project, started_at, started_at_epoch, status)
|
||||
VALUES (?, ?, ?, ?, ?, 'active')
|
||||
`).run(e,e,s,n.toISOString(),o),console.error(`[SessionStore] Auto-created session record for session_id: ${e}`));let l=this.db.prepare(`
|
||||
`).run(e,e,s,n.toISOString(),o),console.error(`[SessionStore] Auto-created session record for session_id: ${e}`));let m=this.db.prepare(`
|
||||
INSERT INTO observations
|
||||
(sdk_session_id, project, type, title, subtitle, facts, narrative, concepts,
|
||||
files_read, files_modified, prompt_number, created_at, created_at_epoch)
|
||||
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
|
||||
`).run(e,s,t.type,t.title,t.subtitle,JSON.stringify(t.facts),t.narrative,JSON.stringify(t.concepts),JSON.stringify(t.files_read),JSON.stringify(t.files_modified),r||null,n.toISOString(),o);return{id:Number(l.lastInsertRowid),createdAtEpoch:o}}storeSummary(e,s,t,r){let n=new Date,o=n.getTime();this.db.prepare(`
|
||||
`).run(e,s,t.type,t.title,t.subtitle,JSON.stringify(t.facts),t.narrative,JSON.stringify(t.concepts),JSON.stringify(t.files_read),JSON.stringify(t.files_modified),r||null,n.toISOString(),o);return{id:Number(m.lastInsertRowid),createdAtEpoch:o}}storeSummary(e,s,t,r){let n=new Date,o=n.getTime();this.db.prepare(`
|
||||
SELECT id FROM sdk_sessions WHERE sdk_session_id = ?
|
||||
`).get(e)||(this.db.prepare(`
|
||||
INSERT INTO sdk_sessions
|
||||
(claude_session_id, sdk_session_id, project, started_at, started_at_epoch, status)
|
||||
VALUES (?, ?, ?, ?, ?, 'active')
|
||||
`).run(e,e,s,n.toISOString(),o),console.error(`[SessionStore] Auto-created session record for session_id: ${e}`));let l=this.db.prepare(`
|
||||
`).run(e,e,s,n.toISOString(),o),console.error(`[SessionStore] Auto-created session record for session_id: ${e}`));let m=this.db.prepare(`
|
||||
INSERT INTO session_summaries
|
||||
(sdk_session_id, project, request, investigated, learned, completed,
|
||||
next_steps, notes, prompt_number, created_at, created_at_epoch)
|
||||
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
|
||||
`).run(e,s,t.request,t.investigated,t.learned,t.completed,t.next_steps,t.notes,r||null,n.toISOString(),o);return{id:Number(l.lastInsertRowid),createdAtEpoch:o}}markSessionCompleted(e){let s=new Date,t=s.getTime();this.db.prepare(`
|
||||
`).run(e,s,t.request,t.investigated,t.learned,t.completed,t.next_steps,t.notes,r||null,n.toISOString(),o);return{id:Number(m.lastInsertRowid),createdAtEpoch:o}}markSessionCompleted(e){let s=new Date,t=s.getTime();this.db.prepare(`
|
||||
UPDATE sdk_sessions
|
||||
SET status = 'completed', completed_at = ?, completed_at_epoch = ?
|
||||
WHERE id = ?
|
||||
@@ -360,31 +360,31 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
|
||||
WHERE up.id IN (${i})
|
||||
ORDER BY up.created_at_epoch ${n}
|
||||
${o}
|
||||
`).all(...e)}getTimelineAroundTimestamp(e,s=10,t=10,r){return this.getTimelineAroundObservation(null,e,s,t,r)}getTimelineAroundObservation(e,s,t=10,r=10,n){let o=n?"AND project = ?":"",i=n?[n]:[],d,_;if(e!==null){let m=`
|
||||
`).all(...e)}getTimelineAroundTimestamp(e,s=10,t=10,r){return this.getTimelineAroundObservation(null,e,s,t,r)}getTimelineAroundObservation(e,s,t=10,r=10,n){let o=n?"AND project = ?":"",i=n?[n]:[],d,_;if(e!==null){let l=`
|
||||
SELECT id, created_at_epoch
|
||||
FROM observations
|
||||
WHERE id <= ? ${o}
|
||||
ORDER BY id DESC
|
||||
LIMIT ?
|
||||
`,b=`
|
||||
`,S=`
|
||||
SELECT id, created_at_epoch
|
||||
FROM observations
|
||||
WHERE id >= ? ${o}
|
||||
ORDER BY id ASC
|
||||
LIMIT ?
|
||||
`;try{let p=this.db.prepare(m).all(e,...i,t+1),a=this.db.prepare(b).all(e,...i,r+1);if(p.length===0&&a.length===0)return{observations:[],sessions:[],prompts:[]};d=p.length>0?p[p.length-1].created_at_epoch:s,_=a.length>0?a[a.length-1].created_at_epoch:s}catch(p){return console.error("[SessionStore] Error getting boundary observations:",p.message),{observations:[],sessions:[],prompts:[]}}}else{let m=`
|
||||
`;try{let p=this.db.prepare(l).all(e,...i,t+1),a=this.db.prepare(S).all(e,...i,r+1);if(p.length===0&&a.length===0)return{observations:[],sessions:[],prompts:[]};d=p.length>0?p[p.length-1].created_at_epoch:s,_=a.length>0?a[a.length-1].created_at_epoch:s}catch(p){return console.error("[SessionStore] Error getting boundary observations:",p.message),{observations:[],sessions:[],prompts:[]}}}else{let l=`
|
||||
SELECT created_at_epoch
|
||||
FROM observations
|
||||
WHERE created_at_epoch <= ? ${o}
|
||||
ORDER BY created_at_epoch DESC
|
||||
LIMIT ?
|
||||
`,b=`
|
||||
`,S=`
|
||||
SELECT created_at_epoch
|
||||
FROM observations
|
||||
WHERE created_at_epoch >= ? ${o}
|
||||
ORDER BY created_at_epoch ASC
|
||||
LIMIT ?
|
||||
`;try{let p=this.db.prepare(m).all(s,...i,t),a=this.db.prepare(b).all(s,...i,r+1);if(p.length===0&&a.length===0)return{observations:[],sessions:[],prompts:[]};d=p.length>0?p[p.length-1].created_at_epoch:s,_=a.length>0?a[a.length-1].created_at_epoch:s}catch(p){return console.error("[SessionStore] Error getting boundary timestamps:",p.message),{observations:[],sessions:[],prompts:[]}}}let l=`
|
||||
`;try{let p=this.db.prepare(l).all(s,...i,t),a=this.db.prepare(S).all(s,...i,r+1);if(p.length===0&&a.length===0)return{observations:[],sessions:[],prompts:[]};d=p.length>0?p[p.length-1].created_at_epoch:s,_=a.length>0?a[a.length-1].created_at_epoch:s}catch(p){return console.error("[SessionStore] Error getting boundary timestamps:",p.message),{observations:[],sessions:[],prompts:[]}}}let m=`
|
||||
SELECT *
|
||||
FROM observations
|
||||
WHERE created_at_epoch >= ? AND created_at_epoch <= ? ${o}
|
||||
@@ -394,11 +394,11 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
|
||||
FROM session_summaries
|
||||
WHERE created_at_epoch >= ? AND created_at_epoch <= ? ${o}
|
||||
ORDER BY created_at_epoch ASC
|
||||
`,S=`
|
||||
`,b=`
|
||||
SELECT up.*, s.project, s.sdk_session_id
|
||||
FROM user_prompts up
|
||||
JOIN sdk_sessions s ON up.claude_session_id = s.claude_session_id
|
||||
WHERE up.created_at_epoch >= ? AND up.created_at_epoch <= ? ${o.replace("project","s.project")}
|
||||
ORDER BY up.created_at_epoch ASC
|
||||
`;try{let m=this.db.prepare(l).all(d,_,...i),b=this.db.prepare(T).all(d,_,...i),p=this.db.prepare(S).all(d,_,...i);return{observations:m,sessions:b.map(a=>({id:a.id,sdk_session_id:a.sdk_session_id,project:a.project,request:a.request,completed:a.completed,next_steps:a.next_steps,created_at:a.created_at,created_at_epoch:a.created_at_epoch})),prompts:p.map(a=>({id:a.id,claude_session_id:a.claude_session_id,project:a.project,prompt:a.prompt_text,created_at:a.created_at,created_at_epoch:a.created_at_epoch}))}}catch(m){return console.error("[SessionStore] Error querying timeline records:",m.message),{observations:[],sessions:[],prompts:[]}}}close(){this.db.close()}};async function C(c){console.error("[claude-mem cleanup] Hook fired",{input:c?{session_id:c.session_id,cwd:c.cwd,reason:c.reason}:null}),c||(console.log("No input provided - this script is designed to run as a Claude Code SessionEnd hook"),console.log(`
|
||||
Expected input format:`),console.log(JSON.stringify({session_id:"string",cwd:"string",transcript_path:"string",hook_event_name:"SessionEnd",reason:"exit"},null,2)),process.exit(0));let{session_id:e,reason:s}=c;console.error("[claude-mem cleanup] Searching for active SDK session",{session_id:e,reason:s});let t=new g,r=t.findActiveSDKSession(e);r||(console.error("[claude-mem cleanup] No active SDK session found",{session_id:e}),t.close(),console.log('{"continue": true, "suppressOutput": true}'),process.exit(0)),console.error("[claude-mem cleanup] Active SDK session found",{session_id:r.id,sdk_session_id:r.sdk_session_id,project:r.project,worker_port:r.worker_port}),t.markSessionCompleted(r.id),console.error("[claude-mem cleanup] Session marked as completed in database"),t.close();try{let n=r.worker_port||37777;await fetch(`http://127.0.0.1:${n}/sessions/${r.id}/complete`,{method:"POST",signal:AbortSignal.timeout(1e3)}),console.error("[claude-mem cleanup] Worker notified to stop processing indicator")}catch(n){console.error("[claude-mem cleanup] Failed to notify worker (non-critical):",n)}console.error("[claude-mem cleanup] Cleanup completed successfully"),console.log('{"continue": true, "suppressOutput": true}'),process.exit(0)}if(I.isTTY)C(void 0);else{let c="";I.on("data",e=>c+=e),I.on("end",async()=>{let e=c?JSON.parse(c):void 0;await C(e)})}
|
||||
`;try{let l=this.db.prepare(m).all(d,_,...i),S=this.db.prepare(T).all(d,_,...i),p=this.db.prepare(b).all(d,_,...i);return{observations:l,sessions:S.map(a=>({id:a.id,sdk_session_id:a.sdk_session_id,project:a.project,request:a.request,completed:a.completed,next_steps:a.next_steps,created_at:a.created_at,created_at_epoch:a.created_at_epoch})),prompts:p.map(a=>({id:a.id,claude_session_id:a.claude_session_id,project:a.project,prompt:a.prompt_text,created_at:a.created_at,created_at_epoch:a.created_at_epoch}))}}catch(l){return console.error("[SessionStore] Error querying timeline records:",l.message),{observations:[],sessions:[],prompts:[]}}}close(){this.db.close()}};import X from"path";import{homedir as F}from"os";import{existsSync as B,readFileSync as H}from"fs";function C(){try{let c=X.join(F(),".claude-mem","settings.json");if(B(c)){let e=JSON.parse(H(c,"utf-8")),s=parseInt(e.env?.CLAUDE_MEM_WORKER_PORT,10);if(!isNaN(s))return s}}catch{}return parseInt(process.env.CLAUDE_MEM_WORKER_PORT||"37777",10)}async function v(c){console.error("[claude-mem cleanup] Hook fired",{input:c?{session_id:c.session_id,cwd:c.cwd,reason:c.reason}:null}),c||(console.log("No input provided - this script is designed to run as a Claude Code SessionEnd hook"),console.log(`
|
||||
Expected input format:`),console.log(JSON.stringify({session_id:"string",cwd:"string",transcript_path:"string",hook_event_name:"SessionEnd",reason:"exit"},null,2)),process.exit(0));let{session_id:e,reason:s}=c;console.error("[claude-mem cleanup] Searching for active SDK session",{session_id:e,reason:s});let t=new g,r=t.findActiveSDKSession(e);r||(console.error("[claude-mem cleanup] No active SDK session found",{session_id:e}),t.close(),console.log('{"continue": true, "suppressOutput": true}'),process.exit(0)),console.error("[claude-mem cleanup] Active SDK session found",{session_id:r.id,sdk_session_id:r.sdk_session_id,project:r.project,worker_port:r.worker_port}),t.markSessionCompleted(r.id),console.error("[claude-mem cleanup] Session marked as completed in database"),t.close();try{let n=r.worker_port||C();await fetch(`http://127.0.0.1:${n}/sessions/${r.id}/complete`,{method:"POST",signal:AbortSignal.timeout(1e3)}),console.error("[claude-mem cleanup] Worker notified to stop processing indicator")}catch(n){console.error("[claude-mem cleanup] Failed to notify worker (non-critical):",n)}console.error("[claude-mem cleanup] Cleanup completed successfully"),console.log('{"continue": true, "suppressOutput": true}'),process.exit(0)}if(I.isTTY)v(void 0);else{let c="";I.on("data",e=>c+=e),I.on("end",async()=>{let e=c?JSON.parse(c):void 0;await v(e)})}
|
||||
|
||||
@@ -1,7 +1,7 @@
|
||||
#!/usr/bin/env node
|
||||
import F from"path";import{stdin as M}from"process";import ae from"better-sqlite3";import{join as S,dirname as te,basename as be}from"path";import{homedir as B}from"os";import{existsSync as Ne,mkdirSync as re}from"fs";import{fileURLToPath as ne}from"url";function oe(){return typeof __dirname<"u"?__dirname:te(ne(import.meta.url))}var ie=oe(),I=process.env.CLAUDE_MEM_DATA_DIR||S(B(),".claude-mem"),$=process.env.CLAUDE_CONFIG_DIR||S(B(),".claude"),Ie=S(I,"archives"),Le=S(I,"logs"),ye=S(I,"trash"),ve=S(I,"backups"),Ae=S(I,"settings.json"),j=S(I,"claude-mem.db"),Ce=S(I,"vector-db"),De=S($,"settings.json"),xe=S($,"commands"),ke=S($,"CLAUDE.md");function W(d){re(d,{recursive:!0})}function H(){return S(ie,"..","..")}var U=(o=>(o[o.DEBUG=0]="DEBUG",o[o.INFO=1]="INFO",o[o.WARN=2]="WARN",o[o.ERROR=3]="ERROR",o[o.SILENT=4]="SILENT",o))(U||{}),w=class{level;useColor;constructor(){let e=process.env.CLAUDE_MEM_LOG_LEVEL?.toUpperCase()||"INFO";this.level=U[e]??1,this.useColor=process.stdout.isTTY??!1}correlationId(e,s){return`obs-${e}-${s}`}sessionId(e){return`session-${e}`}formatData(e){if(e==null)return"";if(typeof e=="string")return e;if(typeof e=="number"||typeof e=="boolean")return e.toString();if(typeof e=="object"){if(e instanceof Error)return this.level===0?`${e.message}
|
||||
${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Object.keys(e);return s.length===0?"{}":s.length<=3?JSON.stringify(e):`{${s.length} keys: ${s.slice(0,3).join(", ")}...}`}return String(e)}formatTool(e,s){if(!s)return e;try{let t=typeof s=="string"?JSON.parse(s):s;if(e==="Bash"&&t.command){let r=t.command.length>50?t.command.substring(0,50)+"...":t.command;return`${e}(${r})`}if(e==="Read"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Edit"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Write"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}return e}catch{return e}}log(e,s,t,r,o){if(e<this.level)return;let c=new Date().toISOString().replace("T"," ").substring(0,23),a=U[e].padEnd(5),u=s.padEnd(6),m="";r?.correlationId?m=`[${r.correlationId}] `:r?.sessionId&&(m=`[session-${r.sessionId}] `);let E="";o!=null&&(this.level===0&&typeof o=="object"?E=`
|
||||
`+JSON.stringify(o,null,2):E=" "+this.formatData(o));let n="";if(r){let{sessionId:f,sdkSessionId:N,correlationId:l,...p}=r;Object.keys(p).length>0&&(n=` {${Object.entries(p).map(([_,h])=>`${_}=${h}`).join(", ")}}`)}let y=`[${c}] [${a}] [${u}] ${m}${t}${n}${E}`;e===3?console.error(y):console.log(y)}debug(e,s,t,r){this.log(0,e,s,t,r)}info(e,s,t,r){this.log(1,e,s,t,r)}warn(e,s,t,r){this.log(2,e,s,t,r)}error(e,s,t,r){this.log(3,e,s,t,r)}dataIn(e,s,t,r){this.info(e,`\u2192 ${s}`,t,r)}dataOut(e,s,t,r){this.info(e,`\u2190 ${s}`,t,r)}success(e,s,t,r){this.info(e,`\u2713 ${s}`,t,r)}failure(e,s,t,r){this.error(e,`\u2717 ${s}`,t,r)}timing(e,s,t,r){this.info(e,`\u23F1 ${s}`,r,{duration:`${t}ms`})}},G=new w;var D=class{db;constructor(){W(I),this.db=new ae(j),this.db.pragma("journal_mode = WAL"),this.db.pragma("synchronous = NORMAL"),this.db.pragma("foreign_keys = ON"),this.initializeSchema(),this.ensureWorkerPortColumn(),this.ensurePromptTrackingColumns(),this.removeSessionSummariesUniqueConstraint(),this.addObservationHierarchicalFields(),this.makeObservationsTextNullable(),this.createUserPromptsTable()}initializeSchema(){try{this.db.exec(`
|
||||
import X from"path";import{stdin as F}from"process";import ie from"better-sqlite3";import{join as b,dirname as se,basename as Ie}from"path";import{homedir as j}from"os";import{existsSync as ve,mkdirSync as te}from"fs";import{fileURLToPath as re}from"url";function ne(){return typeof __dirname<"u"?__dirname:se(re(import.meta.url))}var oe=ne(),I=process.env.CLAUDE_MEM_DATA_DIR||b(j(),".claude-mem"),$=process.env.CLAUDE_CONFIG_DIR||b(j(),".claude"),De=b(I,"archives"),xe=b(I,"logs"),ke=b(I,"trash"),$e=b(I,"backups"),Ue=b(I,"settings.json"),H=b(I,"claude-mem.db"),Me=b(I,"vector-db"),we=b($,"settings.json"),Fe=b($,"commands"),Xe=b($,"CLAUDE.md");function W(a){te(a,{recursive:!0})}function G(){return b(oe,"..","..")}var U=(o=>(o[o.DEBUG=0]="DEBUG",o[o.INFO=1]="INFO",o[o.WARN=2]="WARN",o[o.ERROR=3]="ERROR",o[o.SILENT=4]="SILENT",o))(U||{}),M=class{level;useColor;constructor(){let e=process.env.CLAUDE_MEM_LOG_LEVEL?.toUpperCase()||"INFO";this.level=U[e]??1,this.useColor=process.stdout.isTTY??!1}correlationId(e,s){return`obs-${e}-${s}`}sessionId(e){return`session-${e}`}formatData(e){if(e==null)return"";if(typeof e=="string")return e;if(typeof e=="number"||typeof e=="boolean")return e.toString();if(typeof e=="object"){if(e instanceof Error)return this.level===0?`${e.message}
|
||||
${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Object.keys(e);return s.length===0?"{}":s.length<=3?JSON.stringify(e):`{${s.length} keys: ${s.slice(0,3).join(", ")}...}`}return String(e)}formatTool(e,s){if(!s)return e;try{let t=typeof s=="string"?JSON.parse(s):s;if(e==="Bash"&&t.command){let r=t.command.length>50?t.command.substring(0,50)+"...":t.command;return`${e}(${r})`}if(e==="Read"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Edit"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Write"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}return e}catch{return e}}log(e,s,t,r,o){if(e<this.level)return;let c=new Date().toISOString().replace("T"," ").substring(0,23),d=U[e].padEnd(5),_=s.padEnd(6),E="";r?.correlationId?E=`[${r.correlationId}] `:r?.sessionId&&(E=`[session-${r.sessionId}] `);let S="";o!=null&&(this.level===0&&typeof o=="object"?S=`
|
||||
`+JSON.stringify(o,null,2):S=" "+this.formatData(o));let n="";if(r){let{sessionId:f,sdkSessionId:N,correlationId:m,...p}=r;Object.keys(p).length>0&&(n=` {${Object.entries(p).map(([u,T])=>`${u}=${T}`).join(", ")}}`)}let y=`[${c}] [${d}] [${_}] ${E}${t}${n}${S}`;e===3?console.error(y):console.log(y)}debug(e,s,t,r){this.log(0,e,s,t,r)}info(e,s,t,r){this.log(1,e,s,t,r)}warn(e,s,t,r){this.log(2,e,s,t,r)}error(e,s,t,r){this.log(3,e,s,t,r)}dataIn(e,s,t,r){this.info(e,`\u2192 ${s}`,t,r)}dataOut(e,s,t,r){this.info(e,`\u2190 ${s}`,t,r)}success(e,s,t,r){this.info(e,`\u2713 ${s}`,t,r)}failure(e,s,t,r){this.error(e,`\u2717 ${s}`,t,r)}timing(e,s,t,r){this.info(e,`\u23F1 ${s}`,r,{duration:`${t}ms`})}},Y=new M;var D=class{db;constructor(){W(I),this.db=new ie(H),this.db.pragma("journal_mode = WAL"),this.db.pragma("synchronous = NORMAL"),this.db.pragma("foreign_keys = ON"),this.initializeSchema(),this.ensureWorkerPortColumn(),this.ensurePromptTrackingColumns(),this.removeSessionSummariesUniqueConstraint(),this.addObservationHierarchicalFields(),this.makeObservationsTextNullable(),this.createUserPromptsTable()}initializeSchema(){try{this.db.exec(`
|
||||
CREATE TABLE IF NOT EXISTS schema_versions (
|
||||
id INTEGER PRIMARY KEY,
|
||||
version INTEGER UNIQUE NOT NULL,
|
||||
@@ -63,7 +63,7 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
|
||||
CREATE INDEX IF NOT EXISTS idx_session_summaries_sdk_session ON session_summaries(sdk_session_id);
|
||||
CREATE INDEX IF NOT EXISTS idx_session_summaries_project ON session_summaries(project);
|
||||
CREATE INDEX IF NOT EXISTS idx_session_summaries_created ON session_summaries(created_at_epoch DESC);
|
||||
`),this.db.prepare("INSERT INTO schema_versions (version, applied_at) VALUES (?, ?)").run(4,new Date().toISOString()),console.error("[SessionStore] Migration004 applied successfully"))}catch(e){throw console.error("[SessionStore] Schema initialization error:",e.message),e}}ensureWorkerPortColumn(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(5))return;this.db.pragma("table_info(sdk_sessions)").some(r=>r.name==="worker_port")||(this.db.exec("ALTER TABLE sdk_sessions ADD COLUMN worker_port INTEGER"),console.error("[SessionStore] Added worker_port column to sdk_sessions table")),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(5,new Date().toISOString())}catch(e){console.error("[SessionStore] Migration error:",e.message)}}ensurePromptTrackingColumns(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(6))return;this.db.pragma("table_info(sdk_sessions)").some(u=>u.name==="prompt_counter")||(this.db.exec("ALTER TABLE sdk_sessions ADD COLUMN prompt_counter INTEGER DEFAULT 0"),console.error("[SessionStore] Added prompt_counter column to sdk_sessions table")),this.db.pragma("table_info(observations)").some(u=>u.name==="prompt_number")||(this.db.exec("ALTER TABLE observations ADD COLUMN prompt_number INTEGER"),console.error("[SessionStore] Added prompt_number column to observations table")),this.db.pragma("table_info(session_summaries)").some(u=>u.name==="prompt_number")||(this.db.exec("ALTER TABLE session_summaries ADD COLUMN prompt_number INTEGER"),console.error("[SessionStore] Added prompt_number column to session_summaries table")),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(6,new Date().toISOString())}catch(e){console.error("[SessionStore] Prompt tracking migration error:",e.message)}}removeSessionSummariesUniqueConstraint(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(7))return;if(!this.db.pragma("index_list(session_summaries)").some(r=>r.unique===1)){this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(7,new Date().toISOString());return}console.error("[SessionStore] Removing UNIQUE constraint from session_summaries.sdk_session_id..."),this.db.exec("BEGIN TRANSACTION");try{this.db.exec(`
|
||||
`),this.db.prepare("INSERT INTO schema_versions (version, applied_at) VALUES (?, ?)").run(4,new Date().toISOString()),console.error("[SessionStore] Migration004 applied successfully"))}catch(e){throw console.error("[SessionStore] Schema initialization error:",e.message),e}}ensureWorkerPortColumn(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(5))return;this.db.pragma("table_info(sdk_sessions)").some(r=>r.name==="worker_port")||(this.db.exec("ALTER TABLE sdk_sessions ADD COLUMN worker_port INTEGER"),console.error("[SessionStore] Added worker_port column to sdk_sessions table")),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(5,new Date().toISOString())}catch(e){console.error("[SessionStore] Migration error:",e.message)}}ensurePromptTrackingColumns(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(6))return;this.db.pragma("table_info(sdk_sessions)").some(_=>_.name==="prompt_counter")||(this.db.exec("ALTER TABLE sdk_sessions ADD COLUMN prompt_counter INTEGER DEFAULT 0"),console.error("[SessionStore] Added prompt_counter column to sdk_sessions table")),this.db.pragma("table_info(observations)").some(_=>_.name==="prompt_number")||(this.db.exec("ALTER TABLE observations ADD COLUMN prompt_number INTEGER"),console.error("[SessionStore] Added prompt_number column to observations table")),this.db.pragma("table_info(session_summaries)").some(_=>_.name==="prompt_number")||(this.db.exec("ALTER TABLE session_summaries ADD COLUMN prompt_number INTEGER"),console.error("[SessionStore] Added prompt_number column to session_summaries table")),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(6,new Date().toISOString())}catch(e){console.error("[SessionStore] Prompt tracking migration error:",e.message)}}removeSessionSummariesUniqueConstraint(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(7))return;if(!this.db.pragma("index_list(session_summaries)").some(r=>r.unique===1)){this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(7,new Date().toISOString());return}console.error("[SessionStore] Removing UNIQUE constraint from session_summaries.sdk_session_id..."),this.db.exec("BEGIN TRANSACTION");try{this.db.exec(`
|
||||
CREATE TABLE session_summaries_new (
|
||||
id INTEGER PRIMARY KEY AUTOINCREMENT,
|
||||
sdk_session_id TEXT NOT NULL,
|
||||
@@ -243,10 +243,10 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
|
||||
SELECT *
|
||||
FROM observations
|
||||
WHERE id = ?
|
||||
`).get(e)||null}getObservationsByIds(e,s={}){if(e.length===0)return[];let{orderBy:t="date_desc",limit:r}=s,o=t==="date_asc"?"ASC":"DESC",c=r?`LIMIT ${r}`:"",a=e.map(()=>"?").join(",");return this.db.prepare(`
|
||||
`).get(e)||null}getObservationsByIds(e,s={}){if(e.length===0)return[];let{orderBy:t="date_desc",limit:r}=s,o=t==="date_asc"?"ASC":"DESC",c=r?`LIMIT ${r}`:"",d=e.map(()=>"?").join(",");return this.db.prepare(`
|
||||
SELECT *
|
||||
FROM observations
|
||||
WHERE id IN (${a})
|
||||
WHERE id IN (${d})
|
||||
ORDER BY created_at_epoch ${o}
|
||||
${c}
|
||||
`).all(...e)}getSummaryForSession(e){return this.db.prepare(`
|
||||
@@ -261,7 +261,7 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
|
||||
SELECT files_read, files_modified
|
||||
FROM observations
|
||||
WHERE sdk_session_id = ?
|
||||
`).all(e),r=new Set,o=new Set;for(let c of t){if(c.files_read)try{let a=JSON.parse(c.files_read);Array.isArray(a)&&a.forEach(u=>r.add(u))}catch{}if(c.files_modified)try{let a=JSON.parse(c.files_modified);Array.isArray(a)&&a.forEach(u=>o.add(u))}catch{}}return{filesRead:Array.from(r),filesModified:Array.from(o)}}getSessionById(e){return this.db.prepare(`
|
||||
`).all(e),r=new Set,o=new Set;for(let c of t){if(c.files_read)try{let d=JSON.parse(c.files_read);Array.isArray(d)&&d.forEach(_=>r.add(_))}catch{}if(c.files_modified)try{let d=JSON.parse(c.files_modified);Array.isArray(d)&&d.forEach(_=>o.add(_))}catch{}}return{filesRead:Array.from(r),filesModified:Array.from(o)}}getSessionById(e){return this.db.prepare(`
|
||||
SELECT id, claude_session_id, sdk_session_id, project, user_prompt
|
||||
FROM sdk_sessions
|
||||
WHERE id = ?
|
||||
@@ -288,17 +288,17 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
|
||||
SELECT prompt_counter FROM sdk_sessions WHERE id = ?
|
||||
`).get(e)?.prompt_counter||1}getPromptCounter(e){return this.db.prepare(`
|
||||
SELECT prompt_counter FROM sdk_sessions WHERE id = ?
|
||||
`).get(e)?.prompt_counter||0}createSDKSession(e,s,t){let r=new Date,o=r.getTime(),a=this.db.prepare(`
|
||||
`).get(e)?.prompt_counter||0}createSDKSession(e,s,t){let r=new Date,o=r.getTime(),d=this.db.prepare(`
|
||||
INSERT OR IGNORE INTO sdk_sessions
|
||||
(claude_session_id, sdk_session_id, project, user_prompt, started_at, started_at_epoch, status)
|
||||
VALUES (?, ?, ?, ?, ?, ?, 'active')
|
||||
`).run(e,e,s,t,r.toISOString(),o);return a.lastInsertRowid===0||a.changes===0?this.db.prepare(`
|
||||
`).run(e,e,s,t,r.toISOString(),o);return d.lastInsertRowid===0||d.changes===0?this.db.prepare(`
|
||||
SELECT id FROM sdk_sessions WHERE claude_session_id = ? LIMIT 1
|
||||
`).get(e).id:a.lastInsertRowid}updateSDKSessionId(e,s){return this.db.prepare(`
|
||||
`).get(e).id:d.lastInsertRowid}updateSDKSessionId(e,s){return this.db.prepare(`
|
||||
UPDATE sdk_sessions
|
||||
SET sdk_session_id = ?
|
||||
WHERE id = ? AND sdk_session_id IS NULL
|
||||
`).run(s,e).changes===0?(G.debug("DB","sdk_session_id already set, skipping update",{sessionId:e,sdkSessionId:s}),!1):!0}setWorkerPort(e,s){this.db.prepare(`
|
||||
`).run(s,e).changes===0?(Y.debug("DB","sdk_session_id already set, skipping update",{sessionId:e,sdkSessionId:s}),!1):!0}setWorkerPort(e,s){this.db.prepare(`
|
||||
UPDATE sdk_sessions
|
||||
SET worker_port = ?
|
||||
WHERE id = ?
|
||||
@@ -317,23 +317,23 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
|
||||
INSERT INTO sdk_sessions
|
||||
(claude_session_id, sdk_session_id, project, started_at, started_at_epoch, status)
|
||||
VALUES (?, ?, ?, ?, ?, 'active')
|
||||
`).run(e,e,s,o.toISOString(),c),console.error(`[SessionStore] Auto-created session record for session_id: ${e}`));let E=this.db.prepare(`
|
||||
`).run(e,e,s,o.toISOString(),c),console.error(`[SessionStore] Auto-created session record for session_id: ${e}`));let S=this.db.prepare(`
|
||||
INSERT INTO observations
|
||||
(sdk_session_id, project, type, title, subtitle, facts, narrative, concepts,
|
||||
files_read, files_modified, prompt_number, created_at, created_at_epoch)
|
||||
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
|
||||
`).run(e,s,t.type,t.title,t.subtitle,JSON.stringify(t.facts),t.narrative,JSON.stringify(t.concepts),JSON.stringify(t.files_read),JSON.stringify(t.files_modified),r||null,o.toISOString(),c);return{id:Number(E.lastInsertRowid),createdAtEpoch:c}}storeSummary(e,s,t,r){let o=new Date,c=o.getTime();this.db.prepare(`
|
||||
`).run(e,s,t.type,t.title,t.subtitle,JSON.stringify(t.facts),t.narrative,JSON.stringify(t.concepts),JSON.stringify(t.files_read),JSON.stringify(t.files_modified),r||null,o.toISOString(),c);return{id:Number(S.lastInsertRowid),createdAtEpoch:c}}storeSummary(e,s,t,r){let o=new Date,c=o.getTime();this.db.prepare(`
|
||||
SELECT id FROM sdk_sessions WHERE sdk_session_id = ?
|
||||
`).get(e)||(this.db.prepare(`
|
||||
INSERT INTO sdk_sessions
|
||||
(claude_session_id, sdk_session_id, project, started_at, started_at_epoch, status)
|
||||
VALUES (?, ?, ?, ?, ?, 'active')
|
||||
`).run(e,e,s,o.toISOString(),c),console.error(`[SessionStore] Auto-created session record for session_id: ${e}`));let E=this.db.prepare(`
|
||||
`).run(e,e,s,o.toISOString(),c),console.error(`[SessionStore] Auto-created session record for session_id: ${e}`));let S=this.db.prepare(`
|
||||
INSERT INTO session_summaries
|
||||
(sdk_session_id, project, request, investigated, learned, completed,
|
||||
next_steps, notes, prompt_number, created_at, created_at_epoch)
|
||||
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
|
||||
`).run(e,s,t.request,t.investigated,t.learned,t.completed,t.next_steps,t.notes,r||null,o.toISOString(),c);return{id:Number(E.lastInsertRowid),createdAtEpoch:c}}markSessionCompleted(e){let s=new Date,t=s.getTime();this.db.prepare(`
|
||||
`).run(e,s,t.request,t.investigated,t.learned,t.completed,t.next_steps,t.notes,r||null,o.toISOString(),c);return{id:Number(S.lastInsertRowid),createdAtEpoch:c}}markSessionCompleted(e){let s=new Date,t=s.getTime();this.db.prepare(`
|
||||
UPDATE sdk_sessions
|
||||
SET status = 'completed', completed_at = ?, completed_at_epoch = ?
|
||||
WHERE id = ?
|
||||
@@ -345,22 +345,22 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
|
||||
UPDATE sdk_sessions
|
||||
SET status = 'failed', completed_at = ?, completed_at_epoch = ?
|
||||
WHERE status = 'active'
|
||||
`).run(e.toISOString(),s).changes}getSessionSummariesByIds(e,s={}){if(e.length===0)return[];let{orderBy:t="date_desc",limit:r}=s,o=t==="date_asc"?"ASC":"DESC",c=r?`LIMIT ${r}`:"",a=e.map(()=>"?").join(",");return this.db.prepare(`
|
||||
`).run(e.toISOString(),s).changes}getSessionSummariesByIds(e,s={}){if(e.length===0)return[];let{orderBy:t="date_desc",limit:r}=s,o=t==="date_asc"?"ASC":"DESC",c=r?`LIMIT ${r}`:"",d=e.map(()=>"?").join(",");return this.db.prepare(`
|
||||
SELECT * FROM session_summaries
|
||||
WHERE id IN (${a})
|
||||
WHERE id IN (${d})
|
||||
ORDER BY created_at_epoch ${o}
|
||||
${c}
|
||||
`).all(...e)}getUserPromptsByIds(e,s={}){if(e.length===0)return[];let{orderBy:t="date_desc",limit:r}=s,o=t==="date_asc"?"ASC":"DESC",c=r?`LIMIT ${r}`:"",a=e.map(()=>"?").join(",");return this.db.prepare(`
|
||||
`).all(...e)}getUserPromptsByIds(e,s={}){if(e.length===0)return[];let{orderBy:t="date_desc",limit:r}=s,o=t==="date_asc"?"ASC":"DESC",c=r?`LIMIT ${r}`:"",d=e.map(()=>"?").join(",");return this.db.prepare(`
|
||||
SELECT
|
||||
up.*,
|
||||
s.project,
|
||||
s.sdk_session_id
|
||||
FROM user_prompts up
|
||||
JOIN sdk_sessions s ON up.claude_session_id = s.claude_session_id
|
||||
WHERE up.id IN (${a})
|
||||
WHERE up.id IN (${d})
|
||||
ORDER BY up.created_at_epoch ${o}
|
||||
${c}
|
||||
`).all(...e)}getTimelineAroundTimestamp(e,s=10,t=10,r){return this.getTimelineAroundObservation(null,e,s,t,r)}getTimelineAroundObservation(e,s,t=10,r=10,o){let c=o?"AND project = ?":"",a=o?[o]:[],u,m;if(e!==null){let f=`
|
||||
`).all(...e)}getTimelineAroundTimestamp(e,s=10,t=10,r){return this.getTimelineAroundObservation(null,e,s,t,r)}getTimelineAroundObservation(e,s,t=10,r=10,o){let c=o?"AND project = ?":"",d=o?[o]:[],_,E;if(e!==null){let f=`
|
||||
SELECT id, created_at_epoch
|
||||
FROM observations
|
||||
WHERE id <= ? ${c}
|
||||
@@ -372,7 +372,7 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
|
||||
WHERE id >= ? ${c}
|
||||
ORDER BY id ASC
|
||||
LIMIT ?
|
||||
`;try{let l=this.db.prepare(f).all(e,...a,t+1),p=this.db.prepare(N).all(e,...a,r+1);if(l.length===0&&p.length===0)return{observations:[],sessions:[],prompts:[]};u=l.length>0?l[l.length-1].created_at_epoch:s,m=p.length>0?p[p.length-1].created_at_epoch:s}catch(l){return console.error("[SessionStore] Error getting boundary observations:",l.message),{observations:[],sessions:[],prompts:[]}}}else{let f=`
|
||||
`;try{let m=this.db.prepare(f).all(e,...d,t+1),p=this.db.prepare(N).all(e,...d,r+1);if(m.length===0&&p.length===0)return{observations:[],sessions:[],prompts:[]};_=m.length>0?m[m.length-1].created_at_epoch:s,E=p.length>0?p[p.length-1].created_at_epoch:s}catch(m){return console.error("[SessionStore] Error getting boundary observations:",m.message),{observations:[],sessions:[],prompts:[]}}}else{let f=`
|
||||
SELECT created_at_epoch
|
||||
FROM observations
|
||||
WHERE created_at_epoch <= ? ${c}
|
||||
@@ -384,7 +384,7 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
|
||||
WHERE created_at_epoch >= ? ${c}
|
||||
ORDER BY created_at_epoch ASC
|
||||
LIMIT ?
|
||||
`;try{let l=this.db.prepare(f).all(s,...a,t),p=this.db.prepare(N).all(s,...a,r+1);if(l.length===0&&p.length===0)return{observations:[],sessions:[],prompts:[]};u=l.length>0?l[l.length-1].created_at_epoch:s,m=p.length>0?p[p.length-1].created_at_epoch:s}catch(l){return console.error("[SessionStore] Error getting boundary timestamps:",l.message),{observations:[],sessions:[],prompts:[]}}}let E=`
|
||||
`;try{let m=this.db.prepare(f).all(s,...d,t),p=this.db.prepare(N).all(s,...d,r+1);if(m.length===0&&p.length===0)return{observations:[],sessions:[],prompts:[]};_=m.length>0?m[m.length-1].created_at_epoch:s,E=p.length>0?p[p.length-1].created_at_epoch:s}catch(m){return console.error("[SessionStore] Error getting boundary timestamps:",m.message),{observations:[],sessions:[],prompts:[]}}}let S=`
|
||||
SELECT *
|
||||
FROM observations
|
||||
WHERE created_at_epoch >= ? AND created_at_epoch <= ? ${c}
|
||||
@@ -400,7 +400,7 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
|
||||
JOIN sdk_sessions s ON up.claude_session_id = s.claude_session_id
|
||||
WHERE up.created_at_epoch >= ? AND up.created_at_epoch <= ? ${c.replace("project","s.project")}
|
||||
ORDER BY up.created_at_epoch ASC
|
||||
`;try{let f=this.db.prepare(E).all(u,m,...a),N=this.db.prepare(n).all(u,m,...a),l=this.db.prepare(y).all(u,m,...a);return{observations:f,sessions:N.map(p=>({id:p.id,sdk_session_id:p.sdk_session_id,project:p.project,request:p.request,completed:p.completed,next_steps:p.next_steps,created_at:p.created_at,created_at_epoch:p.created_at_epoch})),prompts:l.map(p=>({id:p.id,claude_session_id:p.claude_session_id,project:p.project,prompt:p.prompt_text,created_at:p.created_at,created_at_epoch:p.created_at_epoch}))}}catch(f){return console.error("[SessionStore] Error querying timeline records:",f.message),{observations:[],sessions:[],prompts:[]}}}close(){this.db.close()}};import Y from"path";import{spawn as V}from"child_process";var de=parseInt(process.env.CLAUDE_MEM_WORKER_PORT||"37777",10);async function q(d=100){try{return(await fetch(`http://127.0.0.1:${de}/health`,{signal:AbortSignal.timeout(d)})).ok}catch{return!1}}async function ce(d=1e4){let e=Date.now(),s=100;for(;Date.now()-e<d;){if(await q(1e3))return!0;await new Promise(t=>setTimeout(t,s))}return!1}async function K(){if(await q())return;let d=H(),e=Y.join(d,"node_modules",".bin","pm2"),s=Y.join(d,"ecosystem.config.cjs"),t=V(e,["list","--no-color"],{cwd:d,stdio:["ignore","pipe","ignore"]}),r="";if(t.stdout?.on("data",a=>{r+=a.toString()}),await new Promise((a,u)=>{t.on("error",m=>u(m)),t.on("close",m=>{a()})}),!(r.includes("claude-mem-worker")&&r.includes("online"))){let a=V(e,["start",s],{cwd:d,stdio:"ignore"});await new Promise((u,m)=>{a.on("error",E=>m(E)),a.on("close",E=>{E!==0&&E!==null?m(new Error(`PM2 start command failed with exit code ${E}`)):u()})})}if(!await ce(1e4))throw new Error("Worker failed to become healthy after starting")}var pe=parseInt(process.env.CLAUDE_MEM_CONTEXT_OBSERVATIONS||"50",10),J=10,i={reset:"\x1B[0m",bright:"\x1B[1m",dim:"\x1B[2m",cyan:"\x1B[36m",green:"\x1B[32m",yellow:"\x1B[33m",blue:"\x1B[34m",magenta:"\x1B[35m",gray:"\x1B[90m",red:"\x1B[31m"};function ue(d){if(!d)return[];let e=JSON.parse(d);return Array.isArray(e)?e:[]}function _e(d){return new Date(d).toLocaleString("en-US",{month:"short",day:"numeric",hour:"numeric",minute:"2-digit",hour12:!0})}function le(d){return new Date(d).toLocaleString("en-US",{hour:"numeric",minute:"2-digit",hour12:!0})}function me(d){return new Date(d).toLocaleString("en-US",{month:"short",day:"numeric",year:"numeric"})}function Ee(d){return d?Math.ceil(d.length/4):0}function Te(d,e){return F.isAbsolute(d)?F.relative(e,d):d}async function Q(d,e=!1,s=!1){await K();let t=d?.cwd??process.cwd(),r=t?F.basename(t):"unknown-project",o=new D,c=o.db.prepare(`
|
||||
`;try{let f=this.db.prepare(S).all(_,E,...d),N=this.db.prepare(n).all(_,E,...d),m=this.db.prepare(y).all(_,E,...d);return{observations:f,sessions:N.map(p=>({id:p.id,sdk_session_id:p.sdk_session_id,project:p.project,request:p.request,completed:p.completed,next_steps:p.next_steps,created_at:p.created_at,created_at_epoch:p.created_at_epoch})),prompts:m.map(p=>({id:p.id,claude_session_id:p.claude_session_id,project:p.project,prompt:p.prompt_text,created_at:p.created_at,created_at_epoch:p.created_at_epoch}))}}catch(f){return console.error("[SessionStore] Error querying timeline records:",f.message),{observations:[],sessions:[],prompts:[]}}}close(){this.db.close()}};import w from"path";import{homedir as ae}from"os";import{existsSync as de,readFileSync as ce}from"fs";import{execSync as pe}from"child_process";var _e=100,ue=100,me=1e4;function le(){try{let a=w.join(ae(),".claude-mem","settings.json");if(de(a)){let e=JSON.parse(ce(a,"utf-8")),s=parseInt(e.env?.CLAUDE_MEM_WORKER_PORT,10);if(!isNaN(s))return s}}catch{}return parseInt(process.env.CLAUDE_MEM_WORKER_PORT||"37777",10)}async function K(){try{let a=le();return(await fetch(`http://127.0.0.1:${a}/health`,{signal:AbortSignal.timeout(_e)})).ok}catch{return!1}}async function Ee(){let a=Date.now();for(;Date.now()-a<me;){if(await K())return!0;await new Promise(e=>setTimeout(e,ue))}return!1}async function V(){if(await K())return;let a=G(),e=w.join(a,"node_modules",".bin","pm2"),s=w.join(a,"ecosystem.config.cjs");if(pe(`"${e}" restart "${s}"`,{cwd:a,stdio:"pipe"}),!await Ee())throw new Error("Worker failed to become healthy after restart")}var Te=parseInt(process.env.CLAUDE_MEM_CONTEXT_OBSERVATIONS||"50",10),q=10,i={reset:"\x1B[0m",bright:"\x1B[1m",dim:"\x1B[2m",cyan:"\x1B[36m",green:"\x1B[32m",yellow:"\x1B[33m",blue:"\x1B[34m",magenta:"\x1B[35m",gray:"\x1B[90m",red:"\x1B[31m"};function he(a){if(!a)return[];let e=JSON.parse(a);return Array.isArray(e)?e:[]}function ge(a){return new Date(a).toLocaleString("en-US",{month:"short",day:"numeric",hour:"numeric",minute:"2-digit",hour12:!0})}function be(a){return new Date(a).toLocaleString("en-US",{hour:"numeric",minute:"2-digit",hour12:!0})}function Se(a){return new Date(a).toLocaleString("en-US",{month:"short",day:"numeric",year:"numeric"})}function fe(a){return a?Math.ceil(a.length/4):0}function Re(a,e){return X.isAbsolute(a)?X.relative(e,a):a}async function J(a,e=!1,s=!1){await V();let t=a?.cwd??process.cwd(),r=t?X.basename(t):"unknown-project",o=new D,c=o.db.prepare(`
|
||||
SELECT
|
||||
id, sdk_session_id, type, title, subtitle, narrative,
|
||||
facts, concepts, files_read, files_modified,
|
||||
@@ -409,18 +409,18 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
|
||||
WHERE project = ?
|
||||
ORDER BY created_at_epoch DESC
|
||||
LIMIT ?
|
||||
`).all(r,pe),a=o.db.prepare(`
|
||||
`).all(r,Te),d=o.db.prepare(`
|
||||
SELECT id, sdk_session_id, request, completed, next_steps, created_at, created_at_epoch
|
||||
FROM session_summaries
|
||||
WHERE project = ?
|
||||
ORDER BY created_at_epoch DESC
|
||||
LIMIT ?
|
||||
`).all(r,J+1);if(c.length===0&&a.length===0)return o.close(),e?`
|
||||
`).all(r,q+1);if(c.length===0&&d.length===0)return o.close(),e?`
|
||||
${i.bright}${i.cyan}\u{1F4DD} [${r}] recent context${i.reset}
|
||||
${i.gray}${"\u2500".repeat(60)}${i.reset}
|
||||
|
||||
${i.dim}No previous sessions found for this project yet.${i.reset}
|
||||
`:`# [${r}] recent context
|
||||
|
||||
No previous sessions found for this project yet.`;let u=c,m=a.slice(0,J),E=u,n=[];if(e?(n.push(""),n.push(`${i.bright}${i.cyan}\u{1F4DD} [${r}] recent context${i.reset}`),n.push(`${i.gray}${"\u2500".repeat(60)}${i.reset}`),n.push("")):(n.push(`# [${r}] recent context`),n.push("")),E.length>0){e?(n.push(`${i.dim}Legend: \u{1F3AF} session-request | \u{1F534} bugfix | \u{1F7E3} feature | \u{1F504} refactor | \u2705 change | \u{1F535} discovery | \u{1F9E0} decision${i.reset}`),n.push("")):(n.push("**Legend:** \u{1F3AF} session-request | \u{1F534} bugfix | \u{1F7E3} feature | \u{1F504} refactor | \u2705 change | \u{1F535} discovery | \u{1F9E0} decision"),n.push("")),e?(n.push(`${i.dim}\u{1F4A1} Progressive Disclosure: This index shows WHAT exists (titles) and retrieval COST (token counts).${i.reset}`),n.push(`${i.dim} \u2192 Use MCP search tools to fetch full observation details on-demand (Layer 2)${i.reset}`),n.push(`${i.dim} \u2192 Prefer searching observations over re-reading code for past decisions and learnings${i.reset}`),n.push(`${i.dim} \u2192 Critical types (\u{1F534} bugfix, \u{1F9E0} decision) often worth fetching immediately${i.reset}`),n.push("")):(n.push("\u{1F4A1} **Progressive Disclosure:** This index shows WHAT exists (titles) and retrieval COST (token counts)."),n.push("- Use MCP search tools to fetch full observation details on-demand (Layer 2)"),n.push("- Prefer searching observations over re-reading code for past decisions and learnings"),n.push("- Critical types (\u{1F534} bugfix, \u{1F9E0} decision) often worth fetching immediately"),n.push(""));let y=a[0]?.id,f=m.map((_,h)=>{let T=h===0?null:a[h+1];return{..._,displayEpoch:T?T.created_at_epoch:_.created_at_epoch,displayTime:T?T.created_at:_.created_at,isMostRecent:_.id===y}}),N=[...E.map(_=>({type:"observation",data:_})),...f.map(_=>({type:"summary",data:_}))];N.sort((_,h)=>{let T=_.type==="observation"?_.data.created_at_epoch:_.data.displayEpoch,L=h.type==="observation"?h.data.created_at_epoch:h.data.displayEpoch;return T-L});let l=new Map;for(let _ of N){let h=_.type==="observation"?_.data.created_at:_.data.displayTime,T=me(h);l.has(T)||l.set(T,[]),l.get(T).push(_)}let p=Array.from(l.entries()).sort((_,h)=>{let T=new Date(_[0]).getTime(),L=new Date(h[0]).getTime();return T-L});for(let[_,h]of p){e?(n.push(`${i.bright}${i.cyan}${_}${i.reset}`),n.push("")):(n.push(`### ${_}`),n.push(""));let T=null,L="",v=!1;for(let x of h)if(x.type==="summary"){v&&(n.push(""),v=!1,T=null,L="");let g=x.data,A=`${g.request||"Session started"} (${_e(g.displayTime)})`,O=g.isMostRecent?"":`claude-mem://session-summary/${g.id}`;if(e){let b=O?`${i.dim}[${O}]${i.reset}`:"";n.push(`\u{1F3AF} ${i.yellow}#S${g.id}${i.reset} ${A} ${b}`)}else{let b=O?` [\u2192](${O})`:"";n.push(`**\u{1F3AF} #S${g.id}** ${A}${b}`)}n.push("")}else{let g=x.data,A=ue(g.files_modified),O=A.length>0?Te(A[0],t):"General";O!==T&&(v&&n.push(""),e?n.push(`${i.dim}${O}${i.reset}`):n.push(`**${O}**`),e||(n.push("| ID | Time | T | Title | Tokens |"),n.push("|----|------|---|-------|--------|")),T=O,v=!0,L="");let b="\u2022";switch(g.type){case"bugfix":b="\u{1F534}";break;case"feature":b="\u{1F7E3}";break;case"refactor":b="\u{1F504}";break;case"change":b="\u2705";break;case"discovery":b="\u{1F535}";break;case"decision":b="\u{1F9E0}";break;default:b="\u2022"}let C=le(g.created_at),X=g.title||"Untitled",k=Ee(g.narrative),P=C!==L,Z=P?C:"";if(L=C,e){let ee=P?`${i.dim}${C}${i.reset}`:" ".repeat(C.length),se=k>0?`${i.dim}(~${k}t)${i.reset}`:"";n.push(` ${i.dim}#${g.id}${i.reset} ${ee} ${b} ${X} ${se}`)}else n.push(`| #${g.id} | ${Z||"\u2033"} | ${b} | ${X} | ~${k} |`)}v&&n.push("")}let R=a[0];R&&(R.completed||R.next_steps)&&(R.completed&&(e?n.push(`${i.green}Completed:${i.reset} ${R.completed}`):n.push(`**Completed**: ${R.completed}`),n.push("")),R.next_steps&&(e?n.push(`${i.magenta}Next Steps:${i.reset} ${R.next_steps}`):n.push(`**Next Steps**: ${R.next_steps}`),n.push(""))),e?n.push(`${i.dim}Use claude-mem MCP search to access records with the given ID${i.reset}`):n.push("*Use claude-mem MCP search to access records with the given ID*")}return o.close(),n.join(`
|
||||
`).trimEnd()}var z=process.argv.includes("--index"),he=process.argv.includes("--colors");if(M.isTTY||he)Q(void 0,!0,z).then(d=>{console.log(d),process.exit(0)});else{let d="";M.on("data",e=>d+=e),M.on("end",async()=>{let e=d.trim()?JSON.parse(d):void 0,t={hookSpecificOutput:{hookEventName:"SessionStart",additionalContext:await Q(e,!1,z)}};console.log(JSON.stringify(t)),process.exit(0)})}
|
||||
No previous sessions found for this project yet.`;let _=c,E=d.slice(0,q),S=_,n=[];if(e?(n.push(""),n.push(`${i.bright}${i.cyan}\u{1F4DD} [${r}] recent context${i.reset}`),n.push(`${i.gray}${"\u2500".repeat(60)}${i.reset}`),n.push("")):(n.push(`# [${r}] recent context`),n.push("")),S.length>0){e?(n.push(`${i.dim}Legend: \u{1F3AF} session-request | \u{1F534} bugfix | \u{1F7E3} feature | \u{1F504} refactor | \u2705 change | \u{1F535} discovery | \u{1F9E0} decision${i.reset}`),n.push("")):(n.push("**Legend:** \u{1F3AF} session-request | \u{1F534} bugfix | \u{1F7E3} feature | \u{1F504} refactor | \u2705 change | \u{1F535} discovery | \u{1F9E0} decision"),n.push("")),e?(n.push(`${i.dim}\u{1F4A1} Progressive Disclosure: This index shows WHAT exists (titles) and retrieval COST (token counts).${i.reset}`),n.push(`${i.dim} \u2192 Use MCP search tools to fetch full observation details on-demand (Layer 2)${i.reset}`),n.push(`${i.dim} \u2192 Prefer searching observations over re-reading code for past decisions and learnings${i.reset}`),n.push(`${i.dim} \u2192 Critical types (\u{1F534} bugfix, \u{1F9E0} decision) often worth fetching immediately${i.reset}`),n.push("")):(n.push("\u{1F4A1} **Progressive Disclosure:** This index shows WHAT exists (titles) and retrieval COST (token counts)."),n.push("- Use MCP search tools to fetch full observation details on-demand (Layer 2)"),n.push("- Prefer searching observations over re-reading code for past decisions and learnings"),n.push("- Critical types (\u{1F534} bugfix, \u{1F9E0} decision) often worth fetching immediately"),n.push(""));let y=d[0]?.id,f=E.map((u,T)=>{let l=T===0?null:d[T+1];return{...u,displayEpoch:l?l.created_at_epoch:u.created_at_epoch,displayTime:l?l.created_at:u.created_at,isMostRecent:u.id===y}}),N=[...S.map(u=>({type:"observation",data:u})),...f.map(u=>({type:"summary",data:u}))];N.sort((u,T)=>{let l=u.type==="observation"?u.data.created_at_epoch:u.data.displayEpoch,L=T.type==="observation"?T.data.created_at_epoch:T.data.displayEpoch;return l-L});let m=new Map;for(let u of N){let T=u.type==="observation"?u.data.created_at:u.data.displayTime,l=Se(T);m.has(l)||m.set(l,[]),m.get(l).push(u)}let p=Array.from(m.entries()).sort((u,T)=>{let l=new Date(u[0]).getTime(),L=new Date(T[0]).getTime();return l-L});for(let[u,T]of p){e?(n.push(`${i.bright}${i.cyan}${u}${i.reset}`),n.push("")):(n.push(`### ${u}`),n.push(""));let l=null,L="",A=!1;for(let x of T)if(x.type==="summary"){A&&(n.push(""),A=!1,l=null,L="");let h=x.data,v=`${h.request||"Session started"} (${ge(h.displayTime)})`,O=h.isMostRecent?"":`claude-mem://session-summary/${h.id}`;if(e){let g=O?`${i.dim}[${O}]${i.reset}`:"";n.push(`\u{1F3AF} ${i.yellow}#S${h.id}${i.reset} ${v} ${g}`)}else{let g=O?` [\u2192](${O})`:"";n.push(`**\u{1F3AF} #S${h.id}** ${v}${g}`)}n.push("")}else{let h=x.data,v=he(h.files_modified),O=v.length>0?Re(v[0],t):"General";O!==l&&(A&&n.push(""),e?n.push(`${i.dim}${O}${i.reset}`):n.push(`**${O}**`),e||(n.push("| ID | Time | T | Title | Tokens |"),n.push("|----|------|---|-------|--------|")),l=O,A=!0,L="");let g="\u2022";switch(h.type){case"bugfix":g="\u{1F534}";break;case"feature":g="\u{1F7E3}";break;case"refactor":g="\u{1F504}";break;case"change":g="\u2705";break;case"discovery":g="\u{1F535}";break;case"decision":g="\u{1F9E0}";break;default:g="\u2022"}let C=be(h.created_at),B=h.title||"Untitled",k=fe(h.narrative),P=C!==L,z=P?C:"";if(L=C,e){let Z=P?`${i.dim}${C}${i.reset}`:" ".repeat(C.length),ee=k>0?`${i.dim}(~${k}t)${i.reset}`:"";n.push(` ${i.dim}#${h.id}${i.reset} ${Z} ${g} ${B} ${ee}`)}else n.push(`| #${h.id} | ${z||"\u2033"} | ${g} | ${B} | ~${k} |`)}A&&n.push("")}let R=d[0];R&&(R.completed||R.next_steps)&&(R.completed&&(e?n.push(`${i.green}Completed:${i.reset} ${R.completed}`):n.push(`**Completed**: ${R.completed}`),n.push("")),R.next_steps&&(e?n.push(`${i.magenta}Next Steps:${i.reset} ${R.next_steps}`):n.push(`**Next Steps**: ${R.next_steps}`),n.push(""))),e?n.push(`${i.dim}Use claude-mem MCP search to access records with the given ID${i.reset}`):n.push("*Use claude-mem MCP search to access records with the given ID*")}return o.close(),n.join(`
|
||||
`).trimEnd()}var Q=process.argv.includes("--index"),Ne=process.argv.includes("--colors");if(F.isTTY||Ne)J(void 0,!0,Q).then(a=>{console.log(a),process.exit(0)});else{let a="";F.on("data",e=>a+=e),F.on("end",async()=>{let e=a.trim()?JSON.parse(a):void 0,t={hookSpecificOutput:{hookEventName:"SessionStart",additionalContext:await J(e,!1,Q)}};console.log(JSON.stringify(t)),process.exit(0)})}
|
||||
|
||||
+34
-34
@@ -1,7 +1,7 @@
|
||||
#!/usr/bin/env node
|
||||
import Y from"path";import{stdin as U}from"process";import j from"better-sqlite3";import{join as m,dirname as X,basename as J}from"path";import{homedir as I}from"os";import{existsSync as ee,mkdirSync as F}from"fs";import{fileURLToPath as P}from"url";function B(){return typeof __dirname<"u"?__dirname:X(P(import.meta.url))}var H=B(),E=process.env.CLAUDE_MEM_DATA_DIR||m(I(),".claude-mem"),R=process.env.CLAUDE_CONFIG_DIR||m(I(),".claude"),te=m(E,"archives"),re=m(E,"logs"),ne=m(E,"trash"),oe=m(E,"backups"),ie=m(E,"settings.json"),f=m(E,"claude-mem.db"),ae=m(E,"vector-db"),de=m(R,"settings.json"),pe=m(R,"commands"),ce=m(R,"CLAUDE.md");function L(p){F(p,{recursive:!0})}function A(){return m(H,"..","..")}var h=(n=>(n[n.DEBUG=0]="DEBUG",n[n.INFO=1]="INFO",n[n.WARN=2]="WARN",n[n.ERROR=3]="ERROR",n[n.SILENT=4]="SILENT",n))(h||{}),N=class{level;useColor;constructor(){let e=process.env.CLAUDE_MEM_LOG_LEVEL?.toUpperCase()||"INFO";this.level=h[e]??1,this.useColor=process.stdout.isTTY??!1}correlationId(e,s){return`obs-${e}-${s}`}sessionId(e){return`session-${e}`}formatData(e){if(e==null)return"";if(typeof e=="string")return e;if(typeof e=="number"||typeof e=="boolean")return e.toString();if(typeof e=="object"){if(e instanceof Error)return this.level===0?`${e.message}
|
||||
${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Object.keys(e);return s.length===0?"{}":s.length<=3?JSON.stringify(e):`{${s.length} keys: ${s.slice(0,3).join(", ")}...}`}return String(e)}formatTool(e,s){if(!s)return e;try{let t=typeof s=="string"?JSON.parse(s):s;if(e==="Bash"&&t.command){let r=t.command.length>50?t.command.substring(0,50)+"...":t.command;return`${e}(${r})`}if(e==="Read"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Edit"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Write"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}return e}catch{return e}}log(e,s,t,r,n){if(e<this.level)return;let i=new Date().toISOString().replace("T"," ").substring(0,23),o=h[e].padEnd(5),a=s.padEnd(6),c="";r?.correlationId?c=`[${r.correlationId}] `:r?.sessionId&&(c=`[session-${r.sessionId}] `);let _="";n!=null&&(this.level===0&&typeof n=="object"?_=`
|
||||
`+JSON.stringify(n,null,2):_=" "+this.formatData(n));let T="";if(r){let{sessionId:l,sdkSessionId:b,correlationId:u,...d}=r;Object.keys(d).length>0&&(T=` {${Object.entries(d).map(([w,M])=>`${w}=${M}`).join(", ")}}`)}let S=`[${i}] [${o}] [${a}] ${c}${t}${T}${_}`;e===3?console.error(S):console.log(S)}debug(e,s,t,r){this.log(0,e,s,t,r)}info(e,s,t,r){this.log(1,e,s,t,r)}warn(e,s,t,r){this.log(2,e,s,t,r)}error(e,s,t,r){this.log(3,e,s,t,r)}dataIn(e,s,t,r){this.info(e,`\u2192 ${s}`,t,r)}dataOut(e,s,t,r){this.info(e,`\u2190 ${s}`,t,r)}success(e,s,t,r){this.info(e,`\u2713 ${s}`,t,r)}failure(e,s,t,r){this.error(e,`\u2717 ${s}`,t,r)}timing(e,s,t,r){this.info(e,`\u23F1 ${s}`,r,{duration:`${t}ms`})}},C=new N;var g=class{db;constructor(){L(E),this.db=new j(f),this.db.pragma("journal_mode = WAL"),this.db.pragma("synchronous = NORMAL"),this.db.pragma("foreign_keys = ON"),this.initializeSchema(),this.ensureWorkerPortColumn(),this.ensurePromptTrackingColumns(),this.removeSessionSummariesUniqueConstraint(),this.addObservationHierarchicalFields(),this.makeObservationsTextNullable(),this.createUserPromptsTable()}initializeSchema(){try{this.db.exec(`
|
||||
import z from"path";import{stdin as U}from"process";import j from"better-sqlite3";import{join as u,dirname as X,basename as te}from"path";import{homedir as L}from"os";import{existsSync as ie,mkdirSync as F}from"fs";import{fileURLToPath as P}from"url";function H(){return typeof __dirname<"u"?__dirname:X(P(import.meta.url))}var B=H(),m=process.env.CLAUDE_MEM_DATA_DIR||u(L(),".claude-mem"),R=process.env.CLAUDE_CONFIG_DIR||u(L(),".claude"),pe=u(m,"archives"),de=u(m,"logs"),ce=u(m,"trash"),_e=u(m,"backups"),ue=u(m,"settings.json"),A=u(m,"claude-mem.db"),Ee=u(m,"vector-db"),me=u(R,"settings.json"),le=u(R,"commands"),Te=u(R,"CLAUDE.md");function C(a){F(a,{recursive:!0})}function v(){return u(B,"..","..")}var h=(n=>(n[n.DEBUG=0]="DEBUG",n[n.INFO=1]="INFO",n[n.WARN=2]="WARN",n[n.ERROR=3]="ERROR",n[n.SILENT=4]="SILENT",n))(h||{}),N=class{level;useColor;constructor(){let e=process.env.CLAUDE_MEM_LOG_LEVEL?.toUpperCase()||"INFO";this.level=h[e]??1,this.useColor=process.stdout.isTTY??!1}correlationId(e,s){return`obs-${e}-${s}`}sessionId(e){return`session-${e}`}formatData(e){if(e==null)return"";if(typeof e=="string")return e;if(typeof e=="number"||typeof e=="boolean")return e.toString();if(typeof e=="object"){if(e instanceof Error)return this.level===0?`${e.message}
|
||||
${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Object.keys(e);return s.length===0?"{}":s.length<=3?JSON.stringify(e):`{${s.length} keys: ${s.slice(0,3).join(", ")}...}`}return String(e)}formatTool(e,s){if(!s)return e;try{let t=typeof s=="string"?JSON.parse(s):s;if(e==="Bash"&&t.command){let r=t.command.length>50?t.command.substring(0,50)+"...":t.command;return`${e}(${r})`}if(e==="Read"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Edit"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Write"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}return e}catch{return e}}log(e,s,t,r,n){if(e<this.level)return;let o=new Date().toISOString().replace("T"," ").substring(0,23),i=h[e].padEnd(5),d=s.padEnd(6),c="";r?.correlationId?c=`[${r.correlationId}] `:r?.sessionId&&(c=`[session-${r.sessionId}] `);let E="";n!=null&&(this.level===0&&typeof n=="object"?E=`
|
||||
`+JSON.stringify(n,null,2):E=" "+this.formatData(n));let T="";if(r){let{sessionId:l,sdkSessionId:b,correlationId:_,...p}=r;Object.keys(p).length>0&&(T=` {${Object.entries(p).map(([M,w])=>`${M}=${w}`).join(", ")}}`)}let S=`[${o}] [${i}] [${d}] ${c}${t}${T}${E}`;e===3?console.error(S):console.log(S)}debug(e,s,t,r){this.log(0,e,s,t,r)}info(e,s,t,r){this.log(1,e,s,t,r)}warn(e,s,t,r){this.log(2,e,s,t,r)}error(e,s,t,r){this.log(3,e,s,t,r)}dataIn(e,s,t,r){this.info(e,`\u2192 ${s}`,t,r)}dataOut(e,s,t,r){this.info(e,`\u2190 ${s}`,t,r)}success(e,s,t,r){this.info(e,`\u2713 ${s}`,t,r)}failure(e,s,t,r){this.error(e,`\u2717 ${s}`,t,r)}timing(e,s,t,r){this.info(e,`\u23F1 ${s}`,r,{duration:`${t}ms`})}},y=new N;var g=class{db;constructor(){C(m),this.db=new j(A),this.db.pragma("journal_mode = WAL"),this.db.pragma("synchronous = NORMAL"),this.db.pragma("foreign_keys = ON"),this.initializeSchema(),this.ensureWorkerPortColumn(),this.ensurePromptTrackingColumns(),this.removeSessionSummariesUniqueConstraint(),this.addObservationHierarchicalFields(),this.makeObservationsTextNullable(),this.createUserPromptsTable()}initializeSchema(){try{this.db.exec(`
|
||||
CREATE TABLE IF NOT EXISTS schema_versions (
|
||||
id INTEGER PRIMARY KEY,
|
||||
version INTEGER UNIQUE NOT NULL,
|
||||
@@ -63,7 +63,7 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
|
||||
CREATE INDEX IF NOT EXISTS idx_session_summaries_sdk_session ON session_summaries(sdk_session_id);
|
||||
CREATE INDEX IF NOT EXISTS idx_session_summaries_project ON session_summaries(project);
|
||||
CREATE INDEX IF NOT EXISTS idx_session_summaries_created ON session_summaries(created_at_epoch DESC);
|
||||
`),this.db.prepare("INSERT INTO schema_versions (version, applied_at) VALUES (?, ?)").run(4,new Date().toISOString()),console.error("[SessionStore] Migration004 applied successfully"))}catch(e){throw console.error("[SessionStore] Schema initialization error:",e.message),e}}ensureWorkerPortColumn(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(5))return;this.db.pragma("table_info(sdk_sessions)").some(r=>r.name==="worker_port")||(this.db.exec("ALTER TABLE sdk_sessions ADD COLUMN worker_port INTEGER"),console.error("[SessionStore] Added worker_port column to sdk_sessions table")),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(5,new Date().toISOString())}catch(e){console.error("[SessionStore] Migration error:",e.message)}}ensurePromptTrackingColumns(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(6))return;this.db.pragma("table_info(sdk_sessions)").some(a=>a.name==="prompt_counter")||(this.db.exec("ALTER TABLE sdk_sessions ADD COLUMN prompt_counter INTEGER DEFAULT 0"),console.error("[SessionStore] Added prompt_counter column to sdk_sessions table")),this.db.pragma("table_info(observations)").some(a=>a.name==="prompt_number")||(this.db.exec("ALTER TABLE observations ADD COLUMN prompt_number INTEGER"),console.error("[SessionStore] Added prompt_number column to observations table")),this.db.pragma("table_info(session_summaries)").some(a=>a.name==="prompt_number")||(this.db.exec("ALTER TABLE session_summaries ADD COLUMN prompt_number INTEGER"),console.error("[SessionStore] Added prompt_number column to session_summaries table")),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(6,new Date().toISOString())}catch(e){console.error("[SessionStore] Prompt tracking migration error:",e.message)}}removeSessionSummariesUniqueConstraint(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(7))return;if(!this.db.pragma("index_list(session_summaries)").some(r=>r.unique===1)){this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(7,new Date().toISOString());return}console.error("[SessionStore] Removing UNIQUE constraint from session_summaries.sdk_session_id..."),this.db.exec("BEGIN TRANSACTION");try{this.db.exec(`
|
||||
`),this.db.prepare("INSERT INTO schema_versions (version, applied_at) VALUES (?, ?)").run(4,new Date().toISOString()),console.error("[SessionStore] Migration004 applied successfully"))}catch(e){throw console.error("[SessionStore] Schema initialization error:",e.message),e}}ensureWorkerPortColumn(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(5))return;this.db.pragma("table_info(sdk_sessions)").some(r=>r.name==="worker_port")||(this.db.exec("ALTER TABLE sdk_sessions ADD COLUMN worker_port INTEGER"),console.error("[SessionStore] Added worker_port column to sdk_sessions table")),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(5,new Date().toISOString())}catch(e){console.error("[SessionStore] Migration error:",e.message)}}ensurePromptTrackingColumns(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(6))return;this.db.pragma("table_info(sdk_sessions)").some(d=>d.name==="prompt_counter")||(this.db.exec("ALTER TABLE sdk_sessions ADD COLUMN prompt_counter INTEGER DEFAULT 0"),console.error("[SessionStore] Added prompt_counter column to sdk_sessions table")),this.db.pragma("table_info(observations)").some(d=>d.name==="prompt_number")||(this.db.exec("ALTER TABLE observations ADD COLUMN prompt_number INTEGER"),console.error("[SessionStore] Added prompt_number column to observations table")),this.db.pragma("table_info(session_summaries)").some(d=>d.name==="prompt_number")||(this.db.exec("ALTER TABLE session_summaries ADD COLUMN prompt_number INTEGER"),console.error("[SessionStore] Added prompt_number column to session_summaries table")),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(6,new Date().toISOString())}catch(e){console.error("[SessionStore] Prompt tracking migration error:",e.message)}}removeSessionSummariesUniqueConstraint(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(7))return;if(!this.db.pragma("index_list(session_summaries)").some(r=>r.unique===1)){this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(7,new Date().toISOString());return}console.error("[SessionStore] Removing UNIQUE constraint from session_summaries.sdk_session_id..."),this.db.exec("BEGIN TRANSACTION");try{this.db.exec(`
|
||||
CREATE TABLE session_summaries_new (
|
||||
id INTEGER PRIMARY KEY AUTOINCREMENT,
|
||||
sdk_session_id TEXT NOT NULL,
|
||||
@@ -243,12 +243,12 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
|
||||
SELECT *
|
||||
FROM observations
|
||||
WHERE id = ?
|
||||
`).get(e)||null}getObservationsByIds(e,s={}){if(e.length===0)return[];let{orderBy:t="date_desc",limit:r}=s,n=t==="date_asc"?"ASC":"DESC",i=r?`LIMIT ${r}`:"",o=e.map(()=>"?").join(",");return this.db.prepare(`
|
||||
`).get(e)||null}getObservationsByIds(e,s={}){if(e.length===0)return[];let{orderBy:t="date_desc",limit:r}=s,n=t==="date_asc"?"ASC":"DESC",o=r?`LIMIT ${r}`:"",i=e.map(()=>"?").join(",");return this.db.prepare(`
|
||||
SELECT *
|
||||
FROM observations
|
||||
WHERE id IN (${o})
|
||||
WHERE id IN (${i})
|
||||
ORDER BY created_at_epoch ${n}
|
||||
${i}
|
||||
${o}
|
||||
`).all(...e)}getSummaryForSession(e){return this.db.prepare(`
|
||||
SELECT
|
||||
request, investigated, learned, completed, next_steps,
|
||||
@@ -261,7 +261,7 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
|
||||
SELECT files_read, files_modified
|
||||
FROM observations
|
||||
WHERE sdk_session_id = ?
|
||||
`).all(e),r=new Set,n=new Set;for(let i of t){if(i.files_read)try{let o=JSON.parse(i.files_read);Array.isArray(o)&&o.forEach(a=>r.add(a))}catch{}if(i.files_modified)try{let o=JSON.parse(i.files_modified);Array.isArray(o)&&o.forEach(a=>n.add(a))}catch{}}return{filesRead:Array.from(r),filesModified:Array.from(n)}}getSessionById(e){return this.db.prepare(`
|
||||
`).all(e),r=new Set,n=new Set;for(let o of t){if(o.files_read)try{let i=JSON.parse(o.files_read);Array.isArray(i)&&i.forEach(d=>r.add(d))}catch{}if(o.files_modified)try{let i=JSON.parse(o.files_modified);Array.isArray(i)&&i.forEach(d=>n.add(d))}catch{}}return{filesRead:Array.from(r),filesModified:Array.from(n)}}getSessionById(e){return this.db.prepare(`
|
||||
SELECT id, claude_session_id, sdk_session_id, project, user_prompt
|
||||
FROM sdk_sessions
|
||||
WHERE id = ?
|
||||
@@ -288,17 +288,17 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
|
||||
SELECT prompt_counter FROM sdk_sessions WHERE id = ?
|
||||
`).get(e)?.prompt_counter||1}getPromptCounter(e){return this.db.prepare(`
|
||||
SELECT prompt_counter FROM sdk_sessions WHERE id = ?
|
||||
`).get(e)?.prompt_counter||0}createSDKSession(e,s,t){let r=new Date,n=r.getTime(),o=this.db.prepare(`
|
||||
`).get(e)?.prompt_counter||0}createSDKSession(e,s,t){let r=new Date,n=r.getTime(),i=this.db.prepare(`
|
||||
INSERT OR IGNORE INTO sdk_sessions
|
||||
(claude_session_id, sdk_session_id, project, user_prompt, started_at, started_at_epoch, status)
|
||||
VALUES (?, ?, ?, ?, ?, ?, 'active')
|
||||
`).run(e,e,s,t,r.toISOString(),n);return o.lastInsertRowid===0||o.changes===0?this.db.prepare(`
|
||||
`).run(e,e,s,t,r.toISOString(),n);return i.lastInsertRowid===0||i.changes===0?this.db.prepare(`
|
||||
SELECT id FROM sdk_sessions WHERE claude_session_id = ? LIMIT 1
|
||||
`).get(e).id:o.lastInsertRowid}updateSDKSessionId(e,s){return this.db.prepare(`
|
||||
`).get(e).id:i.lastInsertRowid}updateSDKSessionId(e,s){return this.db.prepare(`
|
||||
UPDATE sdk_sessions
|
||||
SET sdk_session_id = ?
|
||||
WHERE id = ? AND sdk_session_id IS NULL
|
||||
`).run(s,e).changes===0?(C.debug("DB","sdk_session_id already set, skipping update",{sessionId:e,sdkSessionId:s}),!1):!0}setWorkerPort(e,s){this.db.prepare(`
|
||||
`).run(s,e).changes===0?(y.debug("DB","sdk_session_id already set, skipping update",{sessionId:e,sdkSessionId:s}),!1):!0}setWorkerPort(e,s){this.db.prepare(`
|
||||
UPDATE sdk_sessions
|
||||
SET worker_port = ?
|
||||
WHERE id = ?
|
||||
@@ -311,29 +311,29 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
|
||||
INSERT INTO user_prompts
|
||||
(claude_session_id, prompt_number, prompt_text, created_at, created_at_epoch)
|
||||
VALUES (?, ?, ?, ?, ?)
|
||||
`).run(e,s,t,r.toISOString(),n).lastInsertRowid}storeObservation(e,s,t,r){let n=new Date,i=n.getTime();this.db.prepare(`
|
||||
`).run(e,s,t,r.toISOString(),n).lastInsertRowid}storeObservation(e,s,t,r){let n=new Date,o=n.getTime();this.db.prepare(`
|
||||
SELECT id FROM sdk_sessions WHERE sdk_session_id = ?
|
||||
`).get(e)||(this.db.prepare(`
|
||||
INSERT INTO sdk_sessions
|
||||
(claude_session_id, sdk_session_id, project, started_at, started_at_epoch, status)
|
||||
VALUES (?, ?, ?, ?, ?, 'active')
|
||||
`).run(e,e,s,n.toISOString(),i),console.error(`[SessionStore] Auto-created session record for session_id: ${e}`));let _=this.db.prepare(`
|
||||
`).run(e,e,s,n.toISOString(),o),console.error(`[SessionStore] Auto-created session record for session_id: ${e}`));let E=this.db.prepare(`
|
||||
INSERT INTO observations
|
||||
(sdk_session_id, project, type, title, subtitle, facts, narrative, concepts,
|
||||
files_read, files_modified, prompt_number, created_at, created_at_epoch)
|
||||
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
|
||||
`).run(e,s,t.type,t.title,t.subtitle,JSON.stringify(t.facts),t.narrative,JSON.stringify(t.concepts),JSON.stringify(t.files_read),JSON.stringify(t.files_modified),r||null,n.toISOString(),i);return{id:Number(_.lastInsertRowid),createdAtEpoch:i}}storeSummary(e,s,t,r){let n=new Date,i=n.getTime();this.db.prepare(`
|
||||
`).run(e,s,t.type,t.title,t.subtitle,JSON.stringify(t.facts),t.narrative,JSON.stringify(t.concepts),JSON.stringify(t.files_read),JSON.stringify(t.files_modified),r||null,n.toISOString(),o);return{id:Number(E.lastInsertRowid),createdAtEpoch:o}}storeSummary(e,s,t,r){let n=new Date,o=n.getTime();this.db.prepare(`
|
||||
SELECT id FROM sdk_sessions WHERE sdk_session_id = ?
|
||||
`).get(e)||(this.db.prepare(`
|
||||
INSERT INTO sdk_sessions
|
||||
(claude_session_id, sdk_session_id, project, started_at, started_at_epoch, status)
|
||||
VALUES (?, ?, ?, ?, ?, 'active')
|
||||
`).run(e,e,s,n.toISOString(),i),console.error(`[SessionStore] Auto-created session record for session_id: ${e}`));let _=this.db.prepare(`
|
||||
`).run(e,e,s,n.toISOString(),o),console.error(`[SessionStore] Auto-created session record for session_id: ${e}`));let E=this.db.prepare(`
|
||||
INSERT INTO session_summaries
|
||||
(sdk_session_id, project, request, investigated, learned, completed,
|
||||
next_steps, notes, prompt_number, created_at, created_at_epoch)
|
||||
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
|
||||
`).run(e,s,t.request,t.investigated,t.learned,t.completed,t.next_steps,t.notes,r||null,n.toISOString(),i);return{id:Number(_.lastInsertRowid),createdAtEpoch:i}}markSessionCompleted(e){let s=new Date,t=s.getTime();this.db.prepare(`
|
||||
`).run(e,s,t.request,t.investigated,t.learned,t.completed,t.next_steps,t.notes,r||null,n.toISOString(),o);return{id:Number(E.lastInsertRowid),createdAtEpoch:o}}markSessionCompleted(e){let s=new Date,t=s.getTime();this.db.prepare(`
|
||||
UPDATE sdk_sessions
|
||||
SET status = 'completed', completed_at = ?, completed_at_epoch = ?
|
||||
WHERE id = ?
|
||||
@@ -345,59 +345,59 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
|
||||
UPDATE sdk_sessions
|
||||
SET status = 'failed', completed_at = ?, completed_at_epoch = ?
|
||||
WHERE status = 'active'
|
||||
`).run(e.toISOString(),s).changes}getSessionSummariesByIds(e,s={}){if(e.length===0)return[];let{orderBy:t="date_desc",limit:r}=s,n=t==="date_asc"?"ASC":"DESC",i=r?`LIMIT ${r}`:"",o=e.map(()=>"?").join(",");return this.db.prepare(`
|
||||
`).run(e.toISOString(),s).changes}getSessionSummariesByIds(e,s={}){if(e.length===0)return[];let{orderBy:t="date_desc",limit:r}=s,n=t==="date_asc"?"ASC":"DESC",o=r?`LIMIT ${r}`:"",i=e.map(()=>"?").join(",");return this.db.prepare(`
|
||||
SELECT * FROM session_summaries
|
||||
WHERE id IN (${o})
|
||||
WHERE id IN (${i})
|
||||
ORDER BY created_at_epoch ${n}
|
||||
${i}
|
||||
`).all(...e)}getUserPromptsByIds(e,s={}){if(e.length===0)return[];let{orderBy:t="date_desc",limit:r}=s,n=t==="date_asc"?"ASC":"DESC",i=r?`LIMIT ${r}`:"",o=e.map(()=>"?").join(",");return this.db.prepare(`
|
||||
${o}
|
||||
`).all(...e)}getUserPromptsByIds(e,s={}){if(e.length===0)return[];let{orderBy:t="date_desc",limit:r}=s,n=t==="date_asc"?"ASC":"DESC",o=r?`LIMIT ${r}`:"",i=e.map(()=>"?").join(",");return this.db.prepare(`
|
||||
SELECT
|
||||
up.*,
|
||||
s.project,
|
||||
s.sdk_session_id
|
||||
FROM user_prompts up
|
||||
JOIN sdk_sessions s ON up.claude_session_id = s.claude_session_id
|
||||
WHERE up.id IN (${o})
|
||||
WHERE up.id IN (${i})
|
||||
ORDER BY up.created_at_epoch ${n}
|
||||
${i}
|
||||
`).all(...e)}getTimelineAroundTimestamp(e,s=10,t=10,r){return this.getTimelineAroundObservation(null,e,s,t,r)}getTimelineAroundObservation(e,s,t=10,r=10,n){let i=n?"AND project = ?":"",o=n?[n]:[],a,c;if(e!==null){let l=`
|
||||
${o}
|
||||
`).all(...e)}getTimelineAroundTimestamp(e,s=10,t=10,r){return this.getTimelineAroundObservation(null,e,s,t,r)}getTimelineAroundObservation(e,s,t=10,r=10,n){let o=n?"AND project = ?":"",i=n?[n]:[],d,c;if(e!==null){let l=`
|
||||
SELECT id, created_at_epoch
|
||||
FROM observations
|
||||
WHERE id <= ? ${i}
|
||||
WHERE id <= ? ${o}
|
||||
ORDER BY id DESC
|
||||
LIMIT ?
|
||||
`,b=`
|
||||
SELECT id, created_at_epoch
|
||||
FROM observations
|
||||
WHERE id >= ? ${i}
|
||||
WHERE id >= ? ${o}
|
||||
ORDER BY id ASC
|
||||
LIMIT ?
|
||||
`;try{let u=this.db.prepare(l).all(e,...o,t+1),d=this.db.prepare(b).all(e,...o,r+1);if(u.length===0&&d.length===0)return{observations:[],sessions:[],prompts:[]};a=u.length>0?u[u.length-1].created_at_epoch:s,c=d.length>0?d[d.length-1].created_at_epoch:s}catch(u){return console.error("[SessionStore] Error getting boundary observations:",u.message),{observations:[],sessions:[],prompts:[]}}}else{let l=`
|
||||
`;try{let _=this.db.prepare(l).all(e,...i,t+1),p=this.db.prepare(b).all(e,...i,r+1);if(_.length===0&&p.length===0)return{observations:[],sessions:[],prompts:[]};d=_.length>0?_[_.length-1].created_at_epoch:s,c=p.length>0?p[p.length-1].created_at_epoch:s}catch(_){return console.error("[SessionStore] Error getting boundary observations:",_.message),{observations:[],sessions:[],prompts:[]}}}else{let l=`
|
||||
SELECT created_at_epoch
|
||||
FROM observations
|
||||
WHERE created_at_epoch <= ? ${i}
|
||||
WHERE created_at_epoch <= ? ${o}
|
||||
ORDER BY created_at_epoch DESC
|
||||
LIMIT ?
|
||||
`,b=`
|
||||
SELECT created_at_epoch
|
||||
FROM observations
|
||||
WHERE created_at_epoch >= ? ${i}
|
||||
WHERE created_at_epoch >= ? ${o}
|
||||
ORDER BY created_at_epoch ASC
|
||||
LIMIT ?
|
||||
`;try{let u=this.db.prepare(l).all(s,...o,t),d=this.db.prepare(b).all(s,...o,r+1);if(u.length===0&&d.length===0)return{observations:[],sessions:[],prompts:[]};a=u.length>0?u[u.length-1].created_at_epoch:s,c=d.length>0?d[d.length-1].created_at_epoch:s}catch(u){return console.error("[SessionStore] Error getting boundary timestamps:",u.message),{observations:[],sessions:[],prompts:[]}}}let _=`
|
||||
`;try{let _=this.db.prepare(l).all(s,...i,t),p=this.db.prepare(b).all(s,...i,r+1);if(_.length===0&&p.length===0)return{observations:[],sessions:[],prompts:[]};d=_.length>0?_[_.length-1].created_at_epoch:s,c=p.length>0?p[p.length-1].created_at_epoch:s}catch(_){return console.error("[SessionStore] Error getting boundary timestamps:",_.message),{observations:[],sessions:[],prompts:[]}}}let E=`
|
||||
SELECT *
|
||||
FROM observations
|
||||
WHERE created_at_epoch >= ? AND created_at_epoch <= ? ${i}
|
||||
WHERE created_at_epoch >= ? AND created_at_epoch <= ? ${o}
|
||||
ORDER BY created_at_epoch ASC
|
||||
`,T=`
|
||||
SELECT *
|
||||
FROM session_summaries
|
||||
WHERE created_at_epoch >= ? AND created_at_epoch <= ? ${i}
|
||||
WHERE created_at_epoch >= ? AND created_at_epoch <= ? ${o}
|
||||
ORDER BY created_at_epoch ASC
|
||||
`,S=`
|
||||
SELECT up.*, s.project, s.sdk_session_id
|
||||
FROM user_prompts up
|
||||
JOIN sdk_sessions s ON up.claude_session_id = s.claude_session_id
|
||||
WHERE up.created_at_epoch >= ? AND up.created_at_epoch <= ? ${i.replace("project","s.project")}
|
||||
WHERE up.created_at_epoch >= ? AND up.created_at_epoch <= ? ${o.replace("project","s.project")}
|
||||
ORDER BY up.created_at_epoch ASC
|
||||
`;try{let l=this.db.prepare(_).all(a,c,...o),b=this.db.prepare(T).all(a,c,...o),u=this.db.prepare(S).all(a,c,...o);return{observations:l,sessions:b.map(d=>({id:d.id,sdk_session_id:d.sdk_session_id,project:d.project,request:d.request,completed:d.completed,next_steps:d.next_steps,created_at:d.created_at,created_at_epoch:d.created_at_epoch})),prompts:u.map(d=>({id:d.id,claude_session_id:d.claude_session_id,project:d.project,prompt:d.prompt_text,created_at:d.created_at,created_at_epoch:d.created_at_epoch}))}}catch(l){return console.error("[SessionStore] Error querying timeline records:",l.message),{observations:[],sessions:[],prompts:[]}}}close(){this.db.close()}};function $(p,e,s){return p==="PreCompact"?e?{continue:!0,suppressOutput:!0}:{continue:!1,stopReason:s.reason||"Pre-compact operation failed",suppressOutput:!0}:p==="SessionStart"?e&&s.context?{continue:!0,suppressOutput:!0,hookSpecificOutput:{hookEventName:"SessionStart",additionalContext:s.context}}:{continue:!0,suppressOutput:!0}:p==="UserPromptSubmit"||p==="PostToolUse"?{continue:!0,suppressOutput:!0}:p==="Stop"?{continue:!0,suppressOutput:!0}:{continue:e,suppressOutput:!0,...s.reason&&!e?{stopReason:s.reason}:{}}}function v(p,e,s={}){let t=$(p,e,s);return JSON.stringify(t)}import y from"path";import{spawn as D}from"child_process";var W=parseInt(process.env.CLAUDE_MEM_WORKER_PORT||"37777",10);async function k(p=100){try{return(await fetch(`http://127.0.0.1:${W}/health`,{signal:AbortSignal.timeout(p)})).ok}catch{return!1}}async function G(p=1e4){let e=Date.now(),s=100;for(;Date.now()-e<p;){if(await k(1e3))return!0;await new Promise(t=>setTimeout(t,s))}return!1}async function x(){if(await k())return;let p=A(),e=y.join(p,"node_modules",".bin","pm2"),s=y.join(p,"ecosystem.config.cjs"),t=D(e,["list","--no-color"],{cwd:p,stdio:["ignore","pipe","ignore"]}),r="";if(t.stdout?.on("data",o=>{r+=o.toString()}),await new Promise((o,a)=>{t.on("error",c=>a(c)),t.on("close",c=>{o()})}),!(r.includes("claude-mem-worker")&&r.includes("online"))){let o=D(e,["start",s],{cwd:p,stdio:"ignore"});await new Promise((a,c)=>{o.on("error",_=>c(_)),o.on("close",_=>{_!==0&&_!==null?c(new Error(`PM2 start command failed with exit code ${_}`)):a()})})}if(!await G(1e4))throw new Error("Worker failed to become healthy after starting")}async function K(p){if(!p)throw new Error("newHook requires input");let{session_id:e,cwd:s,prompt:t}=p,r=Y.basename(s);await x();let n=new g,i=n.createSDKSession(e,r,t),o=n.incrementPromptCounter(i);n.saveUserPrompt(e,o,t),console.error(`[new-hook] Session ${i}, prompt #${o}`),n.close();let a=parseInt(process.env.CLAUDE_MEM_WORKER_PORT||"37777",10);try{let c=await fetch(`http://127.0.0.1:${a}/sessions/${i}/init`,{method:"POST",headers:{"Content-Type":"application/json"},body:JSON.stringify({project:r,userPrompt:t}),signal:AbortSignal.timeout(5e3)});if(!c.ok){let _=await c.text();throw new Error(`Failed to initialize session: ${c.status} ${_}`)}}catch(c){throw c.cause?.code==="ECONNREFUSED"||c.name==="TimeoutError"||c.message.includes("fetch failed")?new Error("There's a problem with the worker. If you just updated, type `pm2 restart claude-mem-worker` in your terminal to continue"):c}console.log(v("UserPromptSubmit",!0))}var O="";U.on("data",p=>O+=p);U.on("end",async()=>{let p=O?JSON.parse(O):void 0;await K(p)});
|
||||
`;try{let l=this.db.prepare(E).all(d,c,...i),b=this.db.prepare(T).all(d,c,...i),_=this.db.prepare(S).all(d,c,...i);return{observations:l,sessions:b.map(p=>({id:p.id,sdk_session_id:p.sdk_session_id,project:p.project,request:p.request,completed:p.completed,next_steps:p.next_steps,created_at:p.created_at,created_at_epoch:p.created_at_epoch})),prompts:_.map(p=>({id:p.id,claude_session_id:p.claude_session_id,project:p.project,prompt:p.prompt_text,created_at:p.created_at,created_at_epoch:p.created_at_epoch}))}}catch(l){return console.error("[SessionStore] Error querying timeline records:",l.message),{observations:[],sessions:[],prompts:[]}}}close(){this.db.close()}};function $(a,e,s){return a==="PreCompact"?e?{continue:!0,suppressOutput:!0}:{continue:!1,stopReason:s.reason||"Pre-compact operation failed",suppressOutput:!0}:a==="SessionStart"?e&&s.context?{continue:!0,suppressOutput:!0,hookSpecificOutput:{hookEventName:"SessionStart",additionalContext:s.context}}:{continue:!0,suppressOutput:!0}:a==="UserPromptSubmit"||a==="PostToolUse"?{continue:!0,suppressOutput:!0}:a==="Stop"?{continue:!0,suppressOutput:!0}:{continue:e,suppressOutput:!0,...s.reason&&!e?{stopReason:s.reason}:{}}}function D(a,e,s={}){let t=$(a,e,s);return JSON.stringify(t)}import O from"path";import{homedir as W}from"os";import{existsSync as G,readFileSync as Y}from"fs";import{execSync as K}from"child_process";var V=100,q=100,J=1e4;function f(){try{let a=O.join(W(),".claude-mem","settings.json");if(G(a)){let e=JSON.parse(Y(a,"utf-8")),s=parseInt(e.env?.CLAUDE_MEM_WORKER_PORT,10);if(!isNaN(s))return s}}catch{}return parseInt(process.env.CLAUDE_MEM_WORKER_PORT||"37777",10)}async function k(){try{let a=f();return(await fetch(`http://127.0.0.1:${a}/health`,{signal:AbortSignal.timeout(V)})).ok}catch{return!1}}async function Q(){let a=Date.now();for(;Date.now()-a<J;){if(await k())return!0;await new Promise(e=>setTimeout(e,q))}return!1}async function x(){if(await k())return;let a=v(),e=O.join(a,"node_modules",".bin","pm2"),s=O.join(a,"ecosystem.config.cjs");if(K(`"${e}" restart "${s}"`,{cwd:a,stdio:"pipe"}),!await Q())throw new Error("Worker failed to become healthy after restart")}async function Z(a){if(!a)throw new Error("newHook requires input");let{session_id:e,cwd:s,prompt:t}=a,r=z.basename(s);await x();let n=new g,o=n.createSDKSession(e,r,t),i=n.incrementPromptCounter(o);n.saveUserPrompt(e,i,t),console.error(`[new-hook] Session ${o}, prompt #${i}`),n.close();let d=f();try{let c=await fetch(`http://127.0.0.1:${d}/sessions/${o}/init`,{method:"POST",headers:{"Content-Type":"application/json"},body:JSON.stringify({project:r,userPrompt:t}),signal:AbortSignal.timeout(5e3)});if(!c.ok){let E=await c.text();throw new Error(`Failed to initialize session: ${c.status} ${E}`)}}catch(c){throw c.cause?.code==="ECONNREFUSED"||c.name==="TimeoutError"||c.message.includes("fetch failed")?new Error("There's a problem with the worker. If you just updated, type `pm2 restart claude-mem-worker` in your terminal to continue"):c}console.log(D("UserPromptSubmit",!0))}var I="";U.on("data",a=>I+=a);U.on("end",async()=>{let a=I?JSON.parse(I):void 0;await Z(a)});
|
||||
|
||||
+12
-12
@@ -1,7 +1,7 @@
|
||||
#!/usr/bin/env node
|
||||
import{stdin as U}from"process";import j from"better-sqlite3";import{join as E,dirname as F,basename as J}from"path";import{homedir as L}from"os";import{existsSync as ee,mkdirSync as X}from"fs";import{fileURLToPath as P}from"url";function B(){return typeof __dirname<"u"?__dirname:F(P(import.meta.url))}var H=B(),l=process.env.CLAUDE_MEM_DATA_DIR||E(L(),".claude-mem"),h=process.env.CLAUDE_CONFIG_DIR||E(L(),".claude"),te=E(l,"archives"),re=E(l,"logs"),ne=E(l,"trash"),oe=E(l,"backups"),ie=E(l,"settings.json"),A=E(l,"claude-mem.db"),ae=E(l,"vector-db"),de=E(h,"settings.json"),pe=E(h,"commands"),ce=E(h,"CLAUDE.md");function C(p){X(p,{recursive:!0})}function v(){return E(H,"..","..")}var N=(n=>(n[n.DEBUG=0]="DEBUG",n[n.INFO=1]="INFO",n[n.WARN=2]="WARN",n[n.ERROR=3]="ERROR",n[n.SILENT=4]="SILENT",n))(N||{}),O=class{level;useColor;constructor(){let e=process.env.CLAUDE_MEM_LOG_LEVEL?.toUpperCase()||"INFO";this.level=N[e]??1,this.useColor=process.stdout.isTTY??!1}correlationId(e,s){return`obs-${e}-${s}`}sessionId(e){return`session-${e}`}formatData(e){if(e==null)return"";if(typeof e=="string")return e;if(typeof e=="number"||typeof e=="boolean")return e.toString();if(typeof e=="object"){if(e instanceof Error)return this.level===0?`${e.message}
|
||||
${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Object.keys(e);return s.length===0?"{}":s.length<=3?JSON.stringify(e):`{${s.length} keys: ${s.slice(0,3).join(", ")}...}`}return String(e)}formatTool(e,s){if(!s)return e;try{let t=typeof s=="string"?JSON.parse(s):s;if(e==="Bash"&&t.command){let r=t.command.length>50?t.command.substring(0,50)+"...":t.command;return`${e}(${r})`}if(e==="Read"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Edit"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Write"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}return e}catch{return e}}log(e,s,t,r,n){if(e<this.level)return;let o=new Date().toISOString().replace("T"," ").substring(0,23),i=N[e].padEnd(5),a=s.padEnd(6),_="";r?.correlationId?_=`[${r.correlationId}] `:r?.sessionId&&(_=`[session-${r.sessionId}] `);let c="";n!=null&&(this.level===0&&typeof n=="object"?c=`
|
||||
`+JSON.stringify(n,null,2):c=" "+this.formatData(n));let m="";if(r){let{sessionId:T,sdkSessionId:g,correlationId:u,...d}=r;Object.keys(d).length>0&&(m=` {${Object.entries(d).map(([w,M])=>`${w}=${M}`).join(", ")}}`)}let S=`[${o}] [${i}] [${a}] ${_}${t}${m}${c}`;e===3?console.error(S):console.log(S)}debug(e,s,t,r){this.log(0,e,s,t,r)}info(e,s,t,r){this.log(1,e,s,t,r)}warn(e,s,t,r){this.log(2,e,s,t,r)}error(e,s,t,r){this.log(3,e,s,t,r)}dataIn(e,s,t,r){this.info(e,`\u2192 ${s}`,t,r)}dataOut(e,s,t,r){this.info(e,`\u2190 ${s}`,t,r)}success(e,s,t,r){this.info(e,`\u2713 ${s}`,t,r)}failure(e,s,t,r){this.error(e,`\u2717 ${s}`,t,r)}timing(e,s,t,r){this.info(e,`\u23F1 ${s}`,r,{duration:`${t}ms`})}},b=new O;var R=class{db;constructor(){C(l),this.db=new j(A),this.db.pragma("journal_mode = WAL"),this.db.pragma("synchronous = NORMAL"),this.db.pragma("foreign_keys = ON"),this.initializeSchema(),this.ensureWorkerPortColumn(),this.ensurePromptTrackingColumns(),this.removeSessionSummariesUniqueConstraint(),this.addObservationHierarchicalFields(),this.makeObservationsTextNullable(),this.createUserPromptsTable()}initializeSchema(){try{this.db.exec(`
|
||||
import{stdin as U}from"process";import j from"better-sqlite3";import{join as E,dirname as F,basename as te}from"path";import{homedir as C}from"os";import{existsSync as ie,mkdirSync as X}from"fs";import{fileURLToPath as H}from"url";function P(){return typeof __dirname<"u"?__dirname:F(H(import.meta.url))}var B=P(),l=process.env.CLAUDE_MEM_DATA_DIR||E(C(),".claude-mem"),h=process.env.CLAUDE_CONFIG_DIR||E(C(),".claude"),de=E(l,"archives"),pe=E(l,"logs"),ce=E(l,"trash"),_e=E(l,"backups"),ue=E(l,"settings.json"),v=E(l,"claude-mem.db"),Ee=E(l,"vector-db"),me=E(h,"settings.json"),le=E(h,"commands"),Te=E(h,"CLAUDE.md");function y(a){X(a,{recursive:!0})}function D(){return E(B,"..","..")}var N=(n=>(n[n.DEBUG=0]="DEBUG",n[n.INFO=1]="INFO",n[n.WARN=2]="WARN",n[n.ERROR=3]="ERROR",n[n.SILENT=4]="SILENT",n))(N||{}),O=class{level;useColor;constructor(){let e=process.env.CLAUDE_MEM_LOG_LEVEL?.toUpperCase()||"INFO";this.level=N[e]??1,this.useColor=process.stdout.isTTY??!1}correlationId(e,s){return`obs-${e}-${s}`}sessionId(e){return`session-${e}`}formatData(e){if(e==null)return"";if(typeof e=="string")return e;if(typeof e=="number"||typeof e=="boolean")return e.toString();if(typeof e=="object"){if(e instanceof Error)return this.level===0?`${e.message}
|
||||
${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Object.keys(e);return s.length===0?"{}":s.length<=3?JSON.stringify(e):`{${s.length} keys: ${s.slice(0,3).join(", ")}...}`}return String(e)}formatTool(e,s){if(!s)return e;try{let t=typeof s=="string"?JSON.parse(s):s;if(e==="Bash"&&t.command){let r=t.command.length>50?t.command.substring(0,50)+"...":t.command;return`${e}(${r})`}if(e==="Read"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Edit"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Write"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}return e}catch{return e}}log(e,s,t,r,n){if(e<this.level)return;let o=new Date().toISOString().replace("T"," ").substring(0,23),i=N[e].padEnd(5),p=s.padEnd(6),u="";r?.correlationId?u=`[${r.correlationId}] `:r?.sessionId&&(u=`[session-${r.sessionId}] `);let c="";n!=null&&(this.level===0&&typeof n=="object"?c=`
|
||||
`+JSON.stringify(n,null,2):c=" "+this.formatData(n));let m="";if(r){let{sessionId:T,sdkSessionId:S,correlationId:_,...d}=r;Object.keys(d).length>0&&(m=` {${Object.entries(d).map(([M,w])=>`${M}=${w}`).join(", ")}}`)}let g=`[${o}] [${i}] [${p}] ${u}${t}${m}${c}`;e===3?console.error(g):console.log(g)}debug(e,s,t,r){this.log(0,e,s,t,r)}info(e,s,t,r){this.log(1,e,s,t,r)}warn(e,s,t,r){this.log(2,e,s,t,r)}error(e,s,t,r){this.log(3,e,s,t,r)}dataIn(e,s,t,r){this.info(e,`\u2192 ${s}`,t,r)}dataOut(e,s,t,r){this.info(e,`\u2190 ${s}`,t,r)}success(e,s,t,r){this.info(e,`\u2713 ${s}`,t,r)}failure(e,s,t,r){this.error(e,`\u2717 ${s}`,t,r)}timing(e,s,t,r){this.info(e,`\u23F1 ${s}`,r,{duration:`${t}ms`})}},b=new O;var R=class{db;constructor(){y(l),this.db=new j(v),this.db.pragma("journal_mode = WAL"),this.db.pragma("synchronous = NORMAL"),this.db.pragma("foreign_keys = ON"),this.initializeSchema(),this.ensureWorkerPortColumn(),this.ensurePromptTrackingColumns(),this.removeSessionSummariesUniqueConstraint(),this.addObservationHierarchicalFields(),this.makeObservationsTextNullable(),this.createUserPromptsTable()}initializeSchema(){try{this.db.exec(`
|
||||
CREATE TABLE IF NOT EXISTS schema_versions (
|
||||
id INTEGER PRIMARY KEY,
|
||||
version INTEGER UNIQUE NOT NULL,
|
||||
@@ -63,7 +63,7 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
|
||||
CREATE INDEX IF NOT EXISTS idx_session_summaries_sdk_session ON session_summaries(sdk_session_id);
|
||||
CREATE INDEX IF NOT EXISTS idx_session_summaries_project ON session_summaries(project);
|
||||
CREATE INDEX IF NOT EXISTS idx_session_summaries_created ON session_summaries(created_at_epoch DESC);
|
||||
`),this.db.prepare("INSERT INTO schema_versions (version, applied_at) VALUES (?, ?)").run(4,new Date().toISOString()),console.error("[SessionStore] Migration004 applied successfully"))}catch(e){throw console.error("[SessionStore] Schema initialization error:",e.message),e}}ensureWorkerPortColumn(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(5))return;this.db.pragma("table_info(sdk_sessions)").some(r=>r.name==="worker_port")||(this.db.exec("ALTER TABLE sdk_sessions ADD COLUMN worker_port INTEGER"),console.error("[SessionStore] Added worker_port column to sdk_sessions table")),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(5,new Date().toISOString())}catch(e){console.error("[SessionStore] Migration error:",e.message)}}ensurePromptTrackingColumns(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(6))return;this.db.pragma("table_info(sdk_sessions)").some(a=>a.name==="prompt_counter")||(this.db.exec("ALTER TABLE sdk_sessions ADD COLUMN prompt_counter INTEGER DEFAULT 0"),console.error("[SessionStore] Added prompt_counter column to sdk_sessions table")),this.db.pragma("table_info(observations)").some(a=>a.name==="prompt_number")||(this.db.exec("ALTER TABLE observations ADD COLUMN prompt_number INTEGER"),console.error("[SessionStore] Added prompt_number column to observations table")),this.db.pragma("table_info(session_summaries)").some(a=>a.name==="prompt_number")||(this.db.exec("ALTER TABLE session_summaries ADD COLUMN prompt_number INTEGER"),console.error("[SessionStore] Added prompt_number column to session_summaries table")),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(6,new Date().toISOString())}catch(e){console.error("[SessionStore] Prompt tracking migration error:",e.message)}}removeSessionSummariesUniqueConstraint(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(7))return;if(!this.db.pragma("index_list(session_summaries)").some(r=>r.unique===1)){this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(7,new Date().toISOString());return}console.error("[SessionStore] Removing UNIQUE constraint from session_summaries.sdk_session_id..."),this.db.exec("BEGIN TRANSACTION");try{this.db.exec(`
|
||||
`),this.db.prepare("INSERT INTO schema_versions (version, applied_at) VALUES (?, ?)").run(4,new Date().toISOString()),console.error("[SessionStore] Migration004 applied successfully"))}catch(e){throw console.error("[SessionStore] Schema initialization error:",e.message),e}}ensureWorkerPortColumn(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(5))return;this.db.pragma("table_info(sdk_sessions)").some(r=>r.name==="worker_port")||(this.db.exec("ALTER TABLE sdk_sessions ADD COLUMN worker_port INTEGER"),console.error("[SessionStore] Added worker_port column to sdk_sessions table")),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(5,new Date().toISOString())}catch(e){console.error("[SessionStore] Migration error:",e.message)}}ensurePromptTrackingColumns(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(6))return;this.db.pragma("table_info(sdk_sessions)").some(p=>p.name==="prompt_counter")||(this.db.exec("ALTER TABLE sdk_sessions ADD COLUMN prompt_counter INTEGER DEFAULT 0"),console.error("[SessionStore] Added prompt_counter column to sdk_sessions table")),this.db.pragma("table_info(observations)").some(p=>p.name==="prompt_number")||(this.db.exec("ALTER TABLE observations ADD COLUMN prompt_number INTEGER"),console.error("[SessionStore] Added prompt_number column to observations table")),this.db.pragma("table_info(session_summaries)").some(p=>p.name==="prompt_number")||(this.db.exec("ALTER TABLE session_summaries ADD COLUMN prompt_number INTEGER"),console.error("[SessionStore] Added prompt_number column to session_summaries table")),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(6,new Date().toISOString())}catch(e){console.error("[SessionStore] Prompt tracking migration error:",e.message)}}removeSessionSummariesUniqueConstraint(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(7))return;if(!this.db.pragma("index_list(session_summaries)").some(r=>r.unique===1)){this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(7,new Date().toISOString());return}console.error("[SessionStore] Removing UNIQUE constraint from session_summaries.sdk_session_id..."),this.db.exec("BEGIN TRANSACTION");try{this.db.exec(`
|
||||
CREATE TABLE session_summaries_new (
|
||||
id INTEGER PRIMARY KEY AUTOINCREMENT,
|
||||
sdk_session_id TEXT NOT NULL,
|
||||
@@ -261,7 +261,7 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
|
||||
SELECT files_read, files_modified
|
||||
FROM observations
|
||||
WHERE sdk_session_id = ?
|
||||
`).all(e),r=new Set,n=new Set;for(let o of t){if(o.files_read)try{let i=JSON.parse(o.files_read);Array.isArray(i)&&i.forEach(a=>r.add(a))}catch{}if(o.files_modified)try{let i=JSON.parse(o.files_modified);Array.isArray(i)&&i.forEach(a=>n.add(a))}catch{}}return{filesRead:Array.from(r),filesModified:Array.from(n)}}getSessionById(e){return this.db.prepare(`
|
||||
`).all(e),r=new Set,n=new Set;for(let o of t){if(o.files_read)try{let i=JSON.parse(o.files_read);Array.isArray(i)&&i.forEach(p=>r.add(p))}catch{}if(o.files_modified)try{let i=JSON.parse(o.files_modified);Array.isArray(i)&&i.forEach(p=>n.add(p))}catch{}}return{filesRead:Array.from(r),filesModified:Array.from(n)}}getSessionById(e){return this.db.prepare(`
|
||||
SELECT id, claude_session_id, sdk_session_id, project, user_prompt
|
||||
FROM sdk_sessions
|
||||
WHERE id = ?
|
||||
@@ -360,31 +360,31 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
|
||||
WHERE up.id IN (${i})
|
||||
ORDER BY up.created_at_epoch ${n}
|
||||
${o}
|
||||
`).all(...e)}getTimelineAroundTimestamp(e,s=10,t=10,r){return this.getTimelineAroundObservation(null,e,s,t,r)}getTimelineAroundObservation(e,s,t=10,r=10,n){let o=n?"AND project = ?":"",i=n?[n]:[],a,_;if(e!==null){let T=`
|
||||
`).all(...e)}getTimelineAroundTimestamp(e,s=10,t=10,r){return this.getTimelineAroundObservation(null,e,s,t,r)}getTimelineAroundObservation(e,s,t=10,r=10,n){let o=n?"AND project = ?":"",i=n?[n]:[],p,u;if(e!==null){let T=`
|
||||
SELECT id, created_at_epoch
|
||||
FROM observations
|
||||
WHERE id <= ? ${o}
|
||||
ORDER BY id DESC
|
||||
LIMIT ?
|
||||
`,g=`
|
||||
`,S=`
|
||||
SELECT id, created_at_epoch
|
||||
FROM observations
|
||||
WHERE id >= ? ${o}
|
||||
ORDER BY id ASC
|
||||
LIMIT ?
|
||||
`;try{let u=this.db.prepare(T).all(e,...i,t+1),d=this.db.prepare(g).all(e,...i,r+1);if(u.length===0&&d.length===0)return{observations:[],sessions:[],prompts:[]};a=u.length>0?u[u.length-1].created_at_epoch:s,_=d.length>0?d[d.length-1].created_at_epoch:s}catch(u){return console.error("[SessionStore] Error getting boundary observations:",u.message),{observations:[],sessions:[],prompts:[]}}}else{let T=`
|
||||
`;try{let _=this.db.prepare(T).all(e,...i,t+1),d=this.db.prepare(S).all(e,...i,r+1);if(_.length===0&&d.length===0)return{observations:[],sessions:[],prompts:[]};p=_.length>0?_[_.length-1].created_at_epoch:s,u=d.length>0?d[d.length-1].created_at_epoch:s}catch(_){return console.error("[SessionStore] Error getting boundary observations:",_.message),{observations:[],sessions:[],prompts:[]}}}else{let T=`
|
||||
SELECT created_at_epoch
|
||||
FROM observations
|
||||
WHERE created_at_epoch <= ? ${o}
|
||||
ORDER BY created_at_epoch DESC
|
||||
LIMIT ?
|
||||
`,g=`
|
||||
`,S=`
|
||||
SELECT created_at_epoch
|
||||
FROM observations
|
||||
WHERE created_at_epoch >= ? ${o}
|
||||
ORDER BY created_at_epoch ASC
|
||||
LIMIT ?
|
||||
`;try{let u=this.db.prepare(T).all(s,...i,t),d=this.db.prepare(g).all(s,...i,r+1);if(u.length===0&&d.length===0)return{observations:[],sessions:[],prompts:[]};a=u.length>0?u[u.length-1].created_at_epoch:s,_=d.length>0?d[d.length-1].created_at_epoch:s}catch(u){return console.error("[SessionStore] Error getting boundary timestamps:",u.message),{observations:[],sessions:[],prompts:[]}}}let c=`
|
||||
`;try{let _=this.db.prepare(T).all(s,...i,t),d=this.db.prepare(S).all(s,...i,r+1);if(_.length===0&&d.length===0)return{observations:[],sessions:[],prompts:[]};p=_.length>0?_[_.length-1].created_at_epoch:s,u=d.length>0?d[d.length-1].created_at_epoch:s}catch(_){return console.error("[SessionStore] Error getting boundary timestamps:",_.message),{observations:[],sessions:[],prompts:[]}}}let c=`
|
||||
SELECT *
|
||||
FROM observations
|
||||
WHERE created_at_epoch >= ? AND created_at_epoch <= ? ${o}
|
||||
@@ -394,10 +394,10 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
|
||||
FROM session_summaries
|
||||
WHERE created_at_epoch >= ? AND created_at_epoch <= ? ${o}
|
||||
ORDER BY created_at_epoch ASC
|
||||
`,S=`
|
||||
`,g=`
|
||||
SELECT up.*, s.project, s.sdk_session_id
|
||||
FROM user_prompts up
|
||||
JOIN sdk_sessions s ON up.claude_session_id = s.claude_session_id
|
||||
WHERE up.created_at_epoch >= ? AND up.created_at_epoch <= ? ${o.replace("project","s.project")}
|
||||
ORDER BY up.created_at_epoch ASC
|
||||
`;try{let T=this.db.prepare(c).all(a,_,...i),g=this.db.prepare(m).all(a,_,...i),u=this.db.prepare(S).all(a,_,...i);return{observations:T,sessions:g.map(d=>({id:d.id,sdk_session_id:d.sdk_session_id,project:d.project,request:d.request,completed:d.completed,next_steps:d.next_steps,created_at:d.created_at,created_at_epoch:d.created_at_epoch})),prompts:u.map(d=>({id:d.id,claude_session_id:d.claude_session_id,project:d.project,prompt:d.prompt_text,created_at:d.created_at,created_at_epoch:d.created_at_epoch}))}}catch(T){return console.error("[SessionStore] Error querying timeline records:",T.message),{observations:[],sessions:[],prompts:[]}}}close(){this.db.close()}};function $(p,e,s){return p==="PreCompact"?e?{continue:!0,suppressOutput:!0}:{continue:!1,stopReason:s.reason||"Pre-compact operation failed",suppressOutput:!0}:p==="SessionStart"?e&&s.context?{continue:!0,suppressOutput:!0,hookSpecificOutput:{hookEventName:"SessionStart",additionalContext:s.context}}:{continue:!0,suppressOutput:!0}:p==="UserPromptSubmit"||p==="PostToolUse"?{continue:!0,suppressOutput:!0}:p==="Stop"?{continue:!0,suppressOutput:!0}:{continue:e,suppressOutput:!0,...s.reason&&!e?{stopReason:s.reason}:{}}}function f(p,e,s={}){let t=$(p,e,s);return JSON.stringify(t)}import y from"path";import{spawn as D}from"child_process";var W=parseInt(process.env.CLAUDE_MEM_WORKER_PORT||"37777",10);async function k(p=100){try{return(await fetch(`http://127.0.0.1:${W}/health`,{signal:AbortSignal.timeout(p)})).ok}catch{return!1}}async function G(p=1e4){let e=Date.now(),s=100;for(;Date.now()-e<p;){if(await k(1e3))return!0;await new Promise(t=>setTimeout(t,s))}return!1}async function x(){if(await k())return;let p=v(),e=y.join(p,"node_modules",".bin","pm2"),s=y.join(p,"ecosystem.config.cjs"),t=D(e,["list","--no-color"],{cwd:p,stdio:["ignore","pipe","ignore"]}),r="";if(t.stdout?.on("data",i=>{r+=i.toString()}),await new Promise((i,a)=>{t.on("error",_=>a(_)),t.on("close",_=>{i()})}),!(r.includes("claude-mem-worker")&&r.includes("online"))){let i=D(e,["start",s],{cwd:p,stdio:"ignore"});await new Promise((a,_)=>{i.on("error",c=>_(c)),i.on("close",c=>{c!==0&&c!==null?_(new Error(`PM2 start command failed with exit code ${c}`)):a()})})}if(!await G(1e4))throw new Error("Worker failed to become healthy after starting")}var Y=new Set(["ListMcpResourcesTool"]);async function K(p){if(!p)throw new Error("saveHook requires input");let{session_id:e,tool_name:s,tool_input:t,tool_output:r}=p;if(Y.has(s)){console.log(f("PostToolUse",!0));return}await x();let n=new R,o=n.createSDKSession(e,"",""),i=n.getPromptCounter(o);n.close();let a=b.formatTool(s,t),_=parseInt(process.env.CLAUDE_MEM_WORKER_PORT||"37777",10);b.dataIn("HOOK",`PostToolUse: ${a}`,{sessionId:o,workerPort:_});try{let c=await fetch(`http://127.0.0.1:${_}/sessions/${o}/observations`,{method:"POST",headers:{"Content-Type":"application/json"},body:JSON.stringify({tool_name:s,tool_input:t!==void 0?JSON.stringify(t):"{}",tool_output:r!==void 0?JSON.stringify(r):"{}",prompt_number:i}),signal:AbortSignal.timeout(2e3)});if(!c.ok){let m=await c.text();throw b.failure("HOOK","Failed to send observation",{sessionId:o,status:c.status},m),new Error(`Failed to send observation to worker: ${c.status} ${m}`)}b.debug("HOOK","Observation sent successfully",{sessionId:o,toolName:s})}catch(c){throw c.cause?.code==="ECONNREFUSED"||c.name==="TimeoutError"||c.message.includes("fetch failed")?new Error("There's a problem with the worker. If you just updated, type `pm2 restart claude-mem-worker` in your terminal to continue"):c}console.log(f("PostToolUse",!0))}var I="";U.on("data",p=>I+=p);U.on("end",async()=>{let p=I?JSON.parse(I):void 0;await K(p)});
|
||||
`;try{let T=this.db.prepare(c).all(p,u,...i),S=this.db.prepare(m).all(p,u,...i),_=this.db.prepare(g).all(p,u,...i);return{observations:T,sessions:S.map(d=>({id:d.id,sdk_session_id:d.sdk_session_id,project:d.project,request:d.request,completed:d.completed,next_steps:d.next_steps,created_at:d.created_at,created_at_epoch:d.created_at_epoch})),prompts:_.map(d=>({id:d.id,claude_session_id:d.claude_session_id,project:d.project,prompt:d.prompt_text,created_at:d.created_at,created_at_epoch:d.created_at_epoch}))}}catch(T){return console.error("[SessionStore] Error querying timeline records:",T.message),{observations:[],sessions:[],prompts:[]}}}close(){this.db.close()}};function $(a,e,s){return a==="PreCompact"?e?{continue:!0,suppressOutput:!0}:{continue:!1,stopReason:s.reason||"Pre-compact operation failed",suppressOutput:!0}:a==="SessionStart"?e&&s.context?{continue:!0,suppressOutput:!0,hookSpecificOutput:{hookEventName:"SessionStart",additionalContext:s.context}}:{continue:!0,suppressOutput:!0}:a==="UserPromptSubmit"||a==="PostToolUse"?{continue:!0,suppressOutput:!0}:a==="Stop"?{continue:!0,suppressOutput:!0}:{continue:e,suppressOutput:!0,...s.reason&&!e?{stopReason:s.reason}:{}}}function f(a,e,s={}){let t=$(a,e,s);return JSON.stringify(t)}import I from"path";import{homedir as W}from"os";import{existsSync as G,readFileSync as Y}from"fs";import{execSync as K}from"child_process";var V=100,q=100,J=1e4;function L(){try{let a=I.join(W(),".claude-mem","settings.json");if(G(a)){let e=JSON.parse(Y(a,"utf-8")),s=parseInt(e.env?.CLAUDE_MEM_WORKER_PORT,10);if(!isNaN(s))return s}}catch{}return parseInt(process.env.CLAUDE_MEM_WORKER_PORT||"37777",10)}async function k(){try{let a=L();return(await fetch(`http://127.0.0.1:${a}/health`,{signal:AbortSignal.timeout(V)})).ok}catch{return!1}}async function Q(){let a=Date.now();for(;Date.now()-a<J;){if(await k())return!0;await new Promise(e=>setTimeout(e,q))}return!1}async function x(){if(await k())return;let a=D(),e=I.join(a,"node_modules",".bin","pm2"),s=I.join(a,"ecosystem.config.cjs");if(K(`"${e}" restart "${s}"`,{cwd:a,stdio:"pipe"}),!await Q())throw new Error("Worker failed to become healthy after restart")}var z=new Set(["ListMcpResourcesTool"]);async function Z(a){if(!a)throw new Error("saveHook requires input");let{session_id:e,tool_name:s,tool_input:t,tool_output:r}=a;if(z.has(s)){console.log(f("PostToolUse",!0));return}await x();let n=new R,o=n.createSDKSession(e,"",""),i=n.getPromptCounter(o);n.close();let p=b.formatTool(s,t),u=L();b.dataIn("HOOK",`PostToolUse: ${p}`,{sessionId:o,workerPort:u});try{let c=await fetch(`http://127.0.0.1:${u}/sessions/${o}/observations`,{method:"POST",headers:{"Content-Type":"application/json"},body:JSON.stringify({tool_name:s,tool_input:t!==void 0?JSON.stringify(t):"{}",tool_output:r!==void 0?JSON.stringify(r):"{}",prompt_number:i}),signal:AbortSignal.timeout(2e3)});if(!c.ok){let m=await c.text();throw b.failure("HOOK","Failed to send observation",{sessionId:o,status:c.status},m),new Error(`Failed to send observation to worker: ${c.status} ${m}`)}b.debug("HOOK","Observation sent successfully",{sessionId:o,toolName:s})}catch(c){throw c.cause?.code==="ECONNREFUSED"||c.name==="TimeoutError"||c.message.includes("fetch failed")?new Error("There's a problem with the worker. If you just updated, type `pm2 restart claude-mem-worker` in your terminal to continue"):c}console.log(f("PostToolUse",!0))}var A="";U.on("data",a=>A+=a);U.on("end",async()=>{let a=A?JSON.parse(A):void 0;await Z(a)});
|
||||
|
||||
@@ -1,7 +1,7 @@
|
||||
#!/usr/bin/env node
|
||||
import{stdin as U}from"process";import j from"better-sqlite3";import{join as m,dirname as F,basename as V}from"path";import{homedir as f}from"os";import{existsSync as Z,mkdirSync as X}from"fs";import{fileURLToPath as P}from"url";function B(){return typeof __dirname<"u"?__dirname:F(P(import.meta.url))}var H=B(),E=process.env.CLAUDE_MEM_DATA_DIR||m(f(),".claude-mem"),h=process.env.CLAUDE_CONFIG_DIR||m(f(),".claude"),se=m(E,"archives"),te=m(E,"logs"),re=m(E,"trash"),ne=m(E,"backups"),oe=m(E,"settings.json"),L=m(E,"claude-mem.db"),ie=m(E,"vector-db"),ae=m(h,"settings.json"),de=m(h,"commands"),pe=m(h,"CLAUDE.md");function A(d){X(d,{recursive:!0})}function C(){return m(H,"..","..")}var N=(n=>(n[n.DEBUG=0]="DEBUG",n[n.INFO=1]="INFO",n[n.WARN=2]="WARN",n[n.ERROR=3]="ERROR",n[n.SILENT=4]="SILENT",n))(N||{}),O=class{level;useColor;constructor(){let e=process.env.CLAUDE_MEM_LOG_LEVEL?.toUpperCase()||"INFO";this.level=N[e]??1,this.useColor=process.stdout.isTTY??!1}correlationId(e,s){return`obs-${e}-${s}`}sessionId(e){return`session-${e}`}formatData(e){if(e==null)return"";if(typeof e=="string")return e;if(typeof e=="number"||typeof e=="boolean")return e.toString();if(typeof e=="object"){if(e instanceof Error)return this.level===0?`${e.message}
|
||||
${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Object.keys(e);return s.length===0?"{}":s.length<=3?JSON.stringify(e):`{${s.length} keys: ${s.slice(0,3).join(", ")}...}`}return String(e)}formatTool(e,s){if(!s)return e;try{let t=typeof s=="string"?JSON.parse(s):s;if(e==="Bash"&&t.command){let r=t.command.length>50?t.command.substring(0,50)+"...":t.command;return`${e}(${r})`}if(e==="Read"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Edit"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Write"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}return e}catch{return e}}log(e,s,t,r,n){if(e<this.level)return;let o=new Date().toISOString().replace("T"," ").substring(0,23),i=N[e].padEnd(5),p=s.padEnd(6),c="";r?.correlationId?c=`[${r.correlationId}] `:r?.sessionId&&(c=`[session-${r.sessionId}] `);let u="";n!=null&&(this.level===0&&typeof n=="object"?u=`
|
||||
`+JSON.stringify(n,null,2):u=" "+this.formatData(n));let T="";if(r){let{sessionId:l,sdkSessionId:b,correlationId:_,...a}=r;Object.keys(a).length>0&&(T=` {${Object.entries(a).map(([w,M])=>`${w}=${M}`).join(", ")}}`)}let S=`[${o}] [${i}] [${p}] ${c}${t}${T}${u}`;e===3?console.error(S):console.log(S)}debug(e,s,t,r){this.log(0,e,s,t,r)}info(e,s,t,r){this.log(1,e,s,t,r)}warn(e,s,t,r){this.log(2,e,s,t,r)}error(e,s,t,r){this.log(3,e,s,t,r)}dataIn(e,s,t,r){this.info(e,`\u2192 ${s}`,t,r)}dataOut(e,s,t,r){this.info(e,`\u2190 ${s}`,t,r)}success(e,s,t,r){this.info(e,`\u2713 ${s}`,t,r)}failure(e,s,t,r){this.error(e,`\u2717 ${s}`,t,r)}timing(e,s,t,r){this.info(e,`\u23F1 ${s}`,r,{duration:`${t}ms`})}},g=new O;var R=class{db;constructor(){A(E),this.db=new j(L),this.db.pragma("journal_mode = WAL"),this.db.pragma("synchronous = NORMAL"),this.db.pragma("foreign_keys = ON"),this.initializeSchema(),this.ensureWorkerPortColumn(),this.ensurePromptTrackingColumns(),this.removeSessionSummariesUniqueConstraint(),this.addObservationHierarchicalFields(),this.makeObservationsTextNullable(),this.createUserPromptsTable()}initializeSchema(){try{this.db.exec(`
|
||||
import{stdin as U}from"process";import j from"better-sqlite3";import{join as _,dirname as F,basename as se}from"path";import{homedir as A}from"os";import{existsSync as oe,mkdirSync as X}from"fs";import{fileURLToPath as H}from"url";function P(){return typeof __dirname<"u"?__dirname:F(H(import.meta.url))}var B=P(),E=process.env.CLAUDE_MEM_DATA_DIR||_(A(),".claude-mem"),h=process.env.CLAUDE_CONFIG_DIR||_(A(),".claude"),ae=_(E,"archives"),de=_(E,"logs"),pe=_(E,"trash"),ce=_(E,"backups"),_e=_(E,"settings.json"),C=_(E,"claude-mem.db"),ue=_(E,"vector-db"),me=_(h,"settings.json"),Ee=_(h,"commands"),le=_(h,"CLAUDE.md");function y(a){X(a,{recursive:!0})}function v(){return _(B,"..","..")}var N=(n=>(n[n.DEBUG=0]="DEBUG",n[n.INFO=1]="INFO",n[n.WARN=2]="WARN",n[n.ERROR=3]="ERROR",n[n.SILENT=4]="SILENT",n))(N||{}),O=class{level;useColor;constructor(){let e=process.env.CLAUDE_MEM_LOG_LEVEL?.toUpperCase()||"INFO";this.level=N[e]??1,this.useColor=process.stdout.isTTY??!1}correlationId(e,s){return`obs-${e}-${s}`}sessionId(e){return`session-${e}`}formatData(e){if(e==null)return"";if(typeof e=="string")return e;if(typeof e=="number"||typeof e=="boolean")return e.toString();if(typeof e=="object"){if(e instanceof Error)return this.level===0?`${e.message}
|
||||
${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Object.keys(e);return s.length===0?"{}":s.length<=3?JSON.stringify(e):`{${s.length} keys: ${s.slice(0,3).join(", ")}...}`}return String(e)}formatTool(e,s){if(!s)return e;try{let t=typeof s=="string"?JSON.parse(s):s;if(e==="Bash"&&t.command){let r=t.command.length>50?t.command.substring(0,50)+"...":t.command;return`${e}(${r})`}if(e==="Read"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Edit"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Write"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}return e}catch{return e}}log(e,s,t,r,n){if(e<this.level)return;let o=new Date().toISOString().replace("T"," ").substring(0,23),i=N[e].padEnd(5),p=s.padEnd(6),u="";r?.correlationId?u=`[${r.correlationId}] `:r?.sessionId&&(u=`[session-${r.sessionId}] `);let m="";n!=null&&(this.level===0&&typeof n=="object"?m=`
|
||||
`+JSON.stringify(n,null,2):m=" "+this.formatData(n));let T="";if(r){let{sessionId:l,sdkSessionId:b,correlationId:c,...d}=r;Object.keys(d).length>0&&(T=` {${Object.entries(d).map(([M,w])=>`${M}=${w}`).join(", ")}}`)}let g=`[${o}] [${i}] [${p}] ${u}${t}${T}${m}`;e===3?console.error(g):console.log(g)}debug(e,s,t,r){this.log(0,e,s,t,r)}info(e,s,t,r){this.log(1,e,s,t,r)}warn(e,s,t,r){this.log(2,e,s,t,r)}error(e,s,t,r){this.log(3,e,s,t,r)}dataIn(e,s,t,r){this.info(e,`\u2192 ${s}`,t,r)}dataOut(e,s,t,r){this.info(e,`\u2190 ${s}`,t,r)}success(e,s,t,r){this.info(e,`\u2713 ${s}`,t,r)}failure(e,s,t,r){this.error(e,`\u2717 ${s}`,t,r)}timing(e,s,t,r){this.info(e,`\u23F1 ${s}`,r,{duration:`${t}ms`})}},S=new O;var R=class{db;constructor(){y(E),this.db=new j(C),this.db.pragma("journal_mode = WAL"),this.db.pragma("synchronous = NORMAL"),this.db.pragma("foreign_keys = ON"),this.initializeSchema(),this.ensureWorkerPortColumn(),this.ensurePromptTrackingColumns(),this.removeSessionSummariesUniqueConstraint(),this.addObservationHierarchicalFields(),this.makeObservationsTextNullable(),this.createUserPromptsTable()}initializeSchema(){try{this.db.exec(`
|
||||
CREATE TABLE IF NOT EXISTS schema_versions (
|
||||
id INTEGER PRIMARY KEY,
|
||||
version INTEGER UNIQUE NOT NULL,
|
||||
@@ -298,7 +298,7 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
|
||||
UPDATE sdk_sessions
|
||||
SET sdk_session_id = ?
|
||||
WHERE id = ? AND sdk_session_id IS NULL
|
||||
`).run(s,e).changes===0?(g.debug("DB","sdk_session_id already set, skipping update",{sessionId:e,sdkSessionId:s}),!1):!0}setWorkerPort(e,s){this.db.prepare(`
|
||||
`).run(s,e).changes===0?(S.debug("DB","sdk_session_id already set, skipping update",{sessionId:e,sdkSessionId:s}),!1):!0}setWorkerPort(e,s){this.db.prepare(`
|
||||
UPDATE sdk_sessions
|
||||
SET worker_port = ?
|
||||
WHERE id = ?
|
||||
@@ -317,23 +317,23 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
|
||||
INSERT INTO sdk_sessions
|
||||
(claude_session_id, sdk_session_id, project, started_at, started_at_epoch, status)
|
||||
VALUES (?, ?, ?, ?, ?, 'active')
|
||||
`).run(e,e,s,n.toISOString(),o),console.error(`[SessionStore] Auto-created session record for session_id: ${e}`));let u=this.db.prepare(`
|
||||
`).run(e,e,s,n.toISOString(),o),console.error(`[SessionStore] Auto-created session record for session_id: ${e}`));let m=this.db.prepare(`
|
||||
INSERT INTO observations
|
||||
(sdk_session_id, project, type, title, subtitle, facts, narrative, concepts,
|
||||
files_read, files_modified, prompt_number, created_at, created_at_epoch)
|
||||
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
|
||||
`).run(e,s,t.type,t.title,t.subtitle,JSON.stringify(t.facts),t.narrative,JSON.stringify(t.concepts),JSON.stringify(t.files_read),JSON.stringify(t.files_modified),r||null,n.toISOString(),o);return{id:Number(u.lastInsertRowid),createdAtEpoch:o}}storeSummary(e,s,t,r){let n=new Date,o=n.getTime();this.db.prepare(`
|
||||
`).run(e,s,t.type,t.title,t.subtitle,JSON.stringify(t.facts),t.narrative,JSON.stringify(t.concepts),JSON.stringify(t.files_read),JSON.stringify(t.files_modified),r||null,n.toISOString(),o);return{id:Number(m.lastInsertRowid),createdAtEpoch:o}}storeSummary(e,s,t,r){let n=new Date,o=n.getTime();this.db.prepare(`
|
||||
SELECT id FROM sdk_sessions WHERE sdk_session_id = ?
|
||||
`).get(e)||(this.db.prepare(`
|
||||
INSERT INTO sdk_sessions
|
||||
(claude_session_id, sdk_session_id, project, started_at, started_at_epoch, status)
|
||||
VALUES (?, ?, ?, ?, ?, 'active')
|
||||
`).run(e,e,s,n.toISOString(),o),console.error(`[SessionStore] Auto-created session record for session_id: ${e}`));let u=this.db.prepare(`
|
||||
`).run(e,e,s,n.toISOString(),o),console.error(`[SessionStore] Auto-created session record for session_id: ${e}`));let m=this.db.prepare(`
|
||||
INSERT INTO session_summaries
|
||||
(sdk_session_id, project, request, investigated, learned, completed,
|
||||
next_steps, notes, prompt_number, created_at, created_at_epoch)
|
||||
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
|
||||
`).run(e,s,t.request,t.investigated,t.learned,t.completed,t.next_steps,t.notes,r||null,n.toISOString(),o);return{id:Number(u.lastInsertRowid),createdAtEpoch:o}}markSessionCompleted(e){let s=new Date,t=s.getTime();this.db.prepare(`
|
||||
`).run(e,s,t.request,t.investigated,t.learned,t.completed,t.next_steps,t.notes,r||null,n.toISOString(),o);return{id:Number(m.lastInsertRowid),createdAtEpoch:o}}markSessionCompleted(e){let s=new Date,t=s.getTime();this.db.prepare(`
|
||||
UPDATE sdk_sessions
|
||||
SET status = 'completed', completed_at = ?, completed_at_epoch = ?
|
||||
WHERE id = ?
|
||||
@@ -360,7 +360,7 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
|
||||
WHERE up.id IN (${i})
|
||||
ORDER BY up.created_at_epoch ${n}
|
||||
${o}
|
||||
`).all(...e)}getTimelineAroundTimestamp(e,s=10,t=10,r){return this.getTimelineAroundObservation(null,e,s,t,r)}getTimelineAroundObservation(e,s,t=10,r=10,n){let o=n?"AND project = ?":"",i=n?[n]:[],p,c;if(e!==null){let l=`
|
||||
`).all(...e)}getTimelineAroundTimestamp(e,s=10,t=10,r){return this.getTimelineAroundObservation(null,e,s,t,r)}getTimelineAroundObservation(e,s,t=10,r=10,n){let o=n?"AND project = ?":"",i=n?[n]:[],p,u;if(e!==null){let l=`
|
||||
SELECT id, created_at_epoch
|
||||
FROM observations
|
||||
WHERE id <= ? ${o}
|
||||
@@ -372,7 +372,7 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
|
||||
WHERE id >= ? ${o}
|
||||
ORDER BY id ASC
|
||||
LIMIT ?
|
||||
`;try{let _=this.db.prepare(l).all(e,...i,t+1),a=this.db.prepare(b).all(e,...i,r+1);if(_.length===0&&a.length===0)return{observations:[],sessions:[],prompts:[]};p=_.length>0?_[_.length-1].created_at_epoch:s,c=a.length>0?a[a.length-1].created_at_epoch:s}catch(_){return console.error("[SessionStore] Error getting boundary observations:",_.message),{observations:[],sessions:[],prompts:[]}}}else{let l=`
|
||||
`;try{let c=this.db.prepare(l).all(e,...i,t+1),d=this.db.prepare(b).all(e,...i,r+1);if(c.length===0&&d.length===0)return{observations:[],sessions:[],prompts:[]};p=c.length>0?c[c.length-1].created_at_epoch:s,u=d.length>0?d[d.length-1].created_at_epoch:s}catch(c){return console.error("[SessionStore] Error getting boundary observations:",c.message),{observations:[],sessions:[],prompts:[]}}}else{let l=`
|
||||
SELECT created_at_epoch
|
||||
FROM observations
|
||||
WHERE created_at_epoch <= ? ${o}
|
||||
@@ -384,7 +384,7 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
|
||||
WHERE created_at_epoch >= ? ${o}
|
||||
ORDER BY created_at_epoch ASC
|
||||
LIMIT ?
|
||||
`;try{let _=this.db.prepare(l).all(s,...i,t),a=this.db.prepare(b).all(s,...i,r+1);if(_.length===0&&a.length===0)return{observations:[],sessions:[],prompts:[]};p=_.length>0?_[_.length-1].created_at_epoch:s,c=a.length>0?a[a.length-1].created_at_epoch:s}catch(_){return console.error("[SessionStore] Error getting boundary timestamps:",_.message),{observations:[],sessions:[],prompts:[]}}}let u=`
|
||||
`;try{let c=this.db.prepare(l).all(s,...i,t),d=this.db.prepare(b).all(s,...i,r+1);if(c.length===0&&d.length===0)return{observations:[],sessions:[],prompts:[]};p=c.length>0?c[c.length-1].created_at_epoch:s,u=d.length>0?d[d.length-1].created_at_epoch:s}catch(c){return console.error("[SessionStore] Error getting boundary timestamps:",c.message),{observations:[],sessions:[],prompts:[]}}}let m=`
|
||||
SELECT *
|
||||
FROM observations
|
||||
WHERE created_at_epoch >= ? AND created_at_epoch <= ? ${o}
|
||||
@@ -394,10 +394,10 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
|
||||
FROM session_summaries
|
||||
WHERE created_at_epoch >= ? AND created_at_epoch <= ? ${o}
|
||||
ORDER BY created_at_epoch ASC
|
||||
`,S=`
|
||||
`,g=`
|
||||
SELECT up.*, s.project, s.sdk_session_id
|
||||
FROM user_prompts up
|
||||
JOIN sdk_sessions s ON up.claude_session_id = s.claude_session_id
|
||||
WHERE up.created_at_epoch >= ? AND up.created_at_epoch <= ? ${o.replace("project","s.project")}
|
||||
ORDER BY up.created_at_epoch ASC
|
||||
`;try{let l=this.db.prepare(u).all(p,c,...i),b=this.db.prepare(T).all(p,c,...i),_=this.db.prepare(S).all(p,c,...i);return{observations:l,sessions:b.map(a=>({id:a.id,sdk_session_id:a.sdk_session_id,project:a.project,request:a.request,completed:a.completed,next_steps:a.next_steps,created_at:a.created_at,created_at_epoch:a.created_at_epoch})),prompts:_.map(a=>({id:a.id,claude_session_id:a.claude_session_id,project:a.project,prompt:a.prompt_text,created_at:a.created_at,created_at_epoch:a.created_at_epoch}))}}catch(l){return console.error("[SessionStore] Error querying timeline records:",l.message),{observations:[],sessions:[],prompts:[]}}}close(){this.db.close()}};function $(d,e,s){return d==="PreCompact"?e?{continue:!0,suppressOutput:!0}:{continue:!1,stopReason:s.reason||"Pre-compact operation failed",suppressOutput:!0}:d==="SessionStart"?e&&s.context?{continue:!0,suppressOutput:!0,hookSpecificOutput:{hookEventName:"SessionStart",additionalContext:s.context}}:{continue:!0,suppressOutput:!0}:d==="UserPromptSubmit"||d==="PostToolUse"?{continue:!0,suppressOutput:!0}:d==="Stop"?{continue:!0,suppressOutput:!0}:{continue:e,suppressOutput:!0,...s.reason&&!e?{stopReason:s.reason}:{}}}function v(d,e,s={}){let t=$(d,e,s);return JSON.stringify(t)}import y from"path";import{spawn as D}from"child_process";var W=parseInt(process.env.CLAUDE_MEM_WORKER_PORT||"37777",10);async function k(d=100){try{return(await fetch(`http://127.0.0.1:${W}/health`,{signal:AbortSignal.timeout(d)})).ok}catch{return!1}}async function G(d=1e4){let e=Date.now(),s=100;for(;Date.now()-e<d;){if(await k(1e3))return!0;await new Promise(t=>setTimeout(t,s))}return!1}async function x(){if(await k())return;let d=C(),e=y.join(d,"node_modules",".bin","pm2"),s=y.join(d,"ecosystem.config.cjs"),t=D(e,["list","--no-color"],{cwd:d,stdio:["ignore","pipe","ignore"]}),r="";if(t.stdout?.on("data",i=>{r+=i.toString()}),await new Promise((i,p)=>{t.on("error",c=>p(c)),t.on("close",c=>{i()})}),!(r.includes("claude-mem-worker")&&r.includes("online"))){let i=D(e,["start",s],{cwd:d,stdio:"ignore"});await new Promise((p,c)=>{i.on("error",u=>c(u)),i.on("close",u=>{u!==0&&u!==null?c(new Error(`PM2 start command failed with exit code ${u}`)):p()})})}if(!await G(1e4))throw new Error("Worker failed to become healthy after starting")}async function Y(d){if(!d)throw new Error("summaryHook requires input");let{session_id:e}=d;await x();let s=new R,t=s.createSDKSession(e,"",""),r=s.getPromptCounter(t);s.close();let n=parseInt(process.env.CLAUDE_MEM_WORKER_PORT||"37777",10);g.dataIn("HOOK","Stop: Requesting summary",{sessionId:t,workerPort:n,promptNumber:r});try{let o=await fetch(`http://127.0.0.1:${n}/sessions/${t}/summarize`,{method:"POST",headers:{"Content-Type":"application/json"},body:JSON.stringify({prompt_number:r}),signal:AbortSignal.timeout(2e3)});if(!o.ok){let i=await o.text();throw g.failure("HOOK","Failed to generate summary",{sessionId:t,status:o.status},i),new Error(`Failed to request summary from worker: ${o.status} ${i}`)}g.debug("HOOK","Summary request sent successfully",{sessionId:t})}catch(o){throw o.cause?.code==="ECONNREFUSED"||o.name==="TimeoutError"||o.message.includes("fetch failed")?new Error("There's a problem with the worker. If you just updated, type `pm2 restart claude-mem-worker` in your terminal to continue"):o}console.log(v("Stop",!0))}var I="";U.on("data",d=>I+=d);U.on("end",async()=>{let d=I?JSON.parse(I):void 0;await Y(d)});
|
||||
`;try{let l=this.db.prepare(m).all(p,u,...i),b=this.db.prepare(T).all(p,u,...i),c=this.db.prepare(g).all(p,u,...i);return{observations:l,sessions:b.map(d=>({id:d.id,sdk_session_id:d.sdk_session_id,project:d.project,request:d.request,completed:d.completed,next_steps:d.next_steps,created_at:d.created_at,created_at_epoch:d.created_at_epoch})),prompts:c.map(d=>({id:d.id,claude_session_id:d.claude_session_id,project:d.project,prompt:d.prompt_text,created_at:d.created_at,created_at_epoch:d.created_at_epoch}))}}catch(l){return console.error("[SessionStore] Error querying timeline records:",l.message),{observations:[],sessions:[],prompts:[]}}}close(){this.db.close()}};function W(a,e,s){return a==="PreCompact"?e?{continue:!0,suppressOutput:!0}:{continue:!1,stopReason:s.reason||"Pre-compact operation failed",suppressOutput:!0}:a==="SessionStart"?e&&s.context?{continue:!0,suppressOutput:!0,hookSpecificOutput:{hookEventName:"SessionStart",additionalContext:s.context}}:{continue:!0,suppressOutput:!0}:a==="UserPromptSubmit"||a==="PostToolUse"?{continue:!0,suppressOutput:!0}:a==="Stop"?{continue:!0,suppressOutput:!0}:{continue:e,suppressOutput:!0,...s.reason&&!e?{stopReason:s.reason}:{}}}function D(a,e,s={}){let t=W(a,e,s);return JSON.stringify(t)}import f from"path";import{homedir as $}from"os";import{existsSync as G,readFileSync as Y}from"fs";import{execSync as K}from"child_process";var q=100,V=100,J=1e4;function I(){try{let a=f.join($(),".claude-mem","settings.json");if(G(a)){let e=JSON.parse(Y(a,"utf-8")),s=parseInt(e.env?.CLAUDE_MEM_WORKER_PORT,10);if(!isNaN(s))return s}}catch{}return parseInt(process.env.CLAUDE_MEM_WORKER_PORT||"37777",10)}async function k(){try{let a=I();return(await fetch(`http://127.0.0.1:${a}/health`,{signal:AbortSignal.timeout(q)})).ok}catch{return!1}}async function Q(){let a=Date.now();for(;Date.now()-a<J;){if(await k())return!0;await new Promise(e=>setTimeout(e,V))}return!1}async function x(){if(await k())return;let a=v(),e=f.join(a,"node_modules",".bin","pm2"),s=f.join(a,"ecosystem.config.cjs");if(K(`"${e}" restart "${s}"`,{cwd:a,stdio:"pipe"}),!await Q())throw new Error("Worker failed to become healthy after restart")}async function z(a){if(!a)throw new Error("summaryHook requires input");let{session_id:e}=a;await x();let s=new R,t=s.createSDKSession(e,"",""),r=s.getPromptCounter(t);s.close();let n=I();S.dataIn("HOOK","Stop: Requesting summary",{sessionId:t,workerPort:n,promptNumber:r});try{let o=await fetch(`http://127.0.0.1:${n}/sessions/${t}/summarize`,{method:"POST",headers:{"Content-Type":"application/json"},body:JSON.stringify({prompt_number:r}),signal:AbortSignal.timeout(2e3)});if(!o.ok){let i=await o.text();throw S.failure("HOOK","Failed to generate summary",{sessionId:t,status:o.status},i),new Error(`Failed to request summary from worker: ${o.status} ${i}`)}S.debug("HOOK","Summary request sent successfully",{sessionId:t})}catch(o){throw o.cause?.code==="ECONNREFUSED"||o.name==="TimeoutError"||o.message.includes("fetch failed")?new Error("There's a problem with the worker. If you just updated, type `pm2 restart claude-mem-worker` in your terminal to continue"):o}console.log(D("Stop",!0))}var L="";U.on("data",a=>L+=a);U.on("end",async()=>{let a=L?JSON.parse(L):void 0;await z(a)});
|
||||
|
||||
@@ -1,5 +1,5 @@
|
||||
#!/usr/bin/env node
|
||||
import{execSync as r}from"child_process";import{join as o}from"path";import{homedir as t}from"os";import{existsSync as s}from"fs";var i=o(t(),".claude","plugins","marketplaces","thedotmack"),a=o(i,"node_modules");s(a)||(console.error(`
|
||||
import{execSync as _}from"child_process";import{join as i}from"path";import{homedir as p}from"os";import{existsSync as D}from"fs";import l from"path";import{homedir as f}from"os";import{existsSync as g,readFileSync as h}from"fs";import{join as e,dirname as m,basename as A}from"path";import{homedir as c}from"os";import{fileURLToPath as u}from"url";function d(){return typeof __dirname<"u"?__dirname:m(u(import.meta.url))}var w=d(),t=process.env.CLAUDE_MEM_DATA_DIR||e(c(),".claude-mem"),s=process.env.CLAUDE_CONFIG_DIR||e(c(),".claude"),P=e(t,"archives"),R=e(t,"logs"),S=e(t,"trash"),I=e(t,"backups"),b=e(t,"settings.json"),v=e(t,"claude-mem.db"),H=e(t,"vector-db"),L=e(s,"settings.json"),M=e(s,"commands"),U=e(s,"CLAUDE.md");function a(){try{let o=l.join(f(),".claude-mem","settings.json");if(g(o)){let n=JSON.parse(h(o,"utf-8")),r=parseInt(n.env?.CLAUDE_MEM_WORKER_PORT,10);if(!isNaN(r))return r}}catch{}return parseInt(process.env.CLAUDE_MEM_WORKER_PORT||"37777",10)}var x=i(p(),".claude","plugins","marketplaces","thedotmack"),k=i(x,"node_modules");D(k)||(console.error(`
|
||||
---
|
||||
\u{1F389} Note: This appears under Plugin Hook Error, but it's not an error. That's the only option for
|
||||
user messages in Claude Code UI until a better method is provided.
|
||||
@@ -17,12 +17,12 @@ Dependencies have been installed in the background. This only happens once.
|
||||
Thank you for installing Claude-Mem!
|
||||
|
||||
This message was not added to your startup context, so you can continue working as normal.
|
||||
`),process.exit(3));try{let e=o(t(),".claude","plugins","marketplaces","thedotmack","plugin","scripts","context-hook.js"),n=r(`node "${e}" --colors`,{encoding:"utf8"});console.error(`
|
||||
`),process.exit(3));try{let o=i(p(),".claude","plugins","marketplaces","thedotmack","plugin","scripts","context-hook.js"),n=_(`node "${o}" --colors`,{encoding:"utf8"}),r=a();console.error(`
|
||||
|
||||
\u{1F4DD} Claude-Mem Context Loaded
|
||||
\u2139\uFE0F Note: This appears as stderr but is informational only
|
||||
|
||||
`+n+`
|
||||
|
||||
\u{1F4FA} Watch live in browser http://localhost:37777/ (New! v5.1)
|
||||
`)}catch(e){console.error(`\u274C Failed to load context display: ${e}`)}process.exit(3);
|
||||
\u{1F4FA} Watch live in browser http://localhost:${r}/ (New! v5.1)
|
||||
`)}catch(o){console.error(`\u274C Failed to load context display: ${o}`)}process.exit(3);
|
||||
|
||||
File diff suppressed because one or more lines are too long
@@ -5,6 +5,7 @@
|
||||
|
||||
import { stdin } from 'process';
|
||||
import { SessionStore } from '../services/sqlite/SessionStore.js';
|
||||
import { getWorkerPort } from '../shared/worker-utils.js';
|
||||
|
||||
export interface SessionEndInput {
|
||||
session_id: string;
|
||||
@@ -71,7 +72,7 @@ async function cleanupHook(input?: SessionEndInput): Promise<void> {
|
||||
|
||||
// Tell worker to stop spinner
|
||||
try {
|
||||
const workerPort = session.worker_port || 37777;
|
||||
const workerPort = session.worker_port || getWorkerPort();
|
||||
await fetch(`http://127.0.0.1:${workerPort}/sessions/${session.id}/complete`, {
|
||||
method: 'POST',
|
||||
signal: AbortSignal.timeout(1000)
|
||||
|
||||
@@ -7,7 +7,7 @@ import path from 'path';
|
||||
import { stdin } from 'process';
|
||||
import { SessionStore } from '../services/sqlite/SessionStore.js';
|
||||
import { createHookResponse } from './hook-response.js';
|
||||
import { ensureWorkerRunning } from '../shared/worker-utils.js';
|
||||
import { ensureWorkerRunning, getWorkerPort } from '../shared/worker-utils.js';
|
||||
|
||||
export interface UserPromptSubmitInput {
|
||||
session_id: string;
|
||||
@@ -43,12 +43,11 @@ async function newHook(input?: UserPromptSubmitInput): Promise<void> {
|
||||
|
||||
db.close();
|
||||
|
||||
// Use fixed worker port
|
||||
const FIXED_PORT = parseInt(process.env.CLAUDE_MEM_WORKER_PORT || '37777', 10);
|
||||
const port = getWorkerPort();
|
||||
|
||||
try {
|
||||
// Initialize session via HTTP
|
||||
const response = await fetch(`http://127.0.0.1:${FIXED_PORT}/sessions/${sessionDbId}/init`, {
|
||||
const response = await fetch(`http://127.0.0.1:${port}/sessions/${sessionDbId}/init`, {
|
||||
method: 'POST',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
body: JSON.stringify({ project, userPrompt: prompt }),
|
||||
|
||||
@@ -7,7 +7,7 @@ import { stdin } from 'process';
|
||||
import { SessionStore } from '../services/sqlite/SessionStore.js';
|
||||
import { createHookResponse } from './hook-response.js';
|
||||
import { logger } from '../utils/logger.js';
|
||||
import { ensureWorkerRunning } from '../shared/worker-utils.js';
|
||||
import { ensureWorkerRunning, getWorkerPort } from '../shared/worker-utils.js';
|
||||
|
||||
export interface PostToolUseInput {
|
||||
session_id: string;
|
||||
@@ -50,16 +50,15 @@ async function saveHook(input?: PostToolUseInput): Promise<void> {
|
||||
|
||||
const toolStr = logger.formatTool(tool_name, tool_input);
|
||||
|
||||
// Use fixed worker port
|
||||
const FIXED_PORT = parseInt(process.env.CLAUDE_MEM_WORKER_PORT || '37777', 10);
|
||||
const port = getWorkerPort();
|
||||
|
||||
logger.dataIn('HOOK', `PostToolUse: ${toolStr}`, {
|
||||
sessionId: sessionDbId,
|
||||
workerPort: FIXED_PORT
|
||||
workerPort: port
|
||||
});
|
||||
|
||||
try {
|
||||
const response = await fetch(`http://127.0.0.1:${FIXED_PORT}/sessions/${sessionDbId}/observations`, {
|
||||
const response = await fetch(`http://127.0.0.1:${port}/sessions/${sessionDbId}/observations`, {
|
||||
method: 'POST',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
body: JSON.stringify({
|
||||
|
||||
@@ -7,7 +7,7 @@ import { stdin } from 'process';
|
||||
import { SessionStore } from '../services/sqlite/SessionStore.js';
|
||||
import { createHookResponse } from './hook-response.js';
|
||||
import { logger } from '../utils/logger.js';
|
||||
import { ensureWorkerRunning } from '../shared/worker-utils.js';
|
||||
import { ensureWorkerRunning, getWorkerPort } from '../shared/worker-utils.js';
|
||||
|
||||
export interface StopInput {
|
||||
session_id: string;
|
||||
@@ -35,17 +35,16 @@ async function summaryHook(input?: StopInput): Promise<void> {
|
||||
const promptNumber = db.getPromptCounter(sessionDbId);
|
||||
db.close();
|
||||
|
||||
// Use fixed worker port
|
||||
const FIXED_PORT = parseInt(process.env.CLAUDE_MEM_WORKER_PORT || '37777', 10);
|
||||
const port = getWorkerPort();
|
||||
|
||||
logger.dataIn('HOOK', 'Stop: Requesting summary', {
|
||||
sessionId: sessionDbId,
|
||||
workerPort: FIXED_PORT,
|
||||
workerPort: port,
|
||||
promptNumber
|
||||
});
|
||||
|
||||
try {
|
||||
const response = await fetch(`http://127.0.0.1:${FIXED_PORT}/sessions/${sessionDbId}/summarize`, {
|
||||
const response = await fetch(`http://127.0.0.1:${port}/sessions/${sessionDbId}/summarize`, {
|
||||
method: 'POST',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
body: JSON.stringify({ prompt_number: promptNumber }),
|
||||
|
||||
@@ -10,6 +10,7 @@ import { execSync } from "child_process";
|
||||
import { join } from "path";
|
||||
import { homedir } from "os";
|
||||
import { existsSync } from "fs";
|
||||
import { getWorkerPort } from "../shared/worker-utils.js";
|
||||
|
||||
// Check if node_modules exists - if not, this is first run
|
||||
const pluginDir = join(homedir(), '.claude', 'plugins', 'marketplaces', 'thedotmack');
|
||||
@@ -46,11 +47,12 @@ try {
|
||||
encoding: 'utf8'
|
||||
});
|
||||
|
||||
const port = getWorkerPort();
|
||||
console.error(
|
||||
"\n\n📝 Claude-Mem Context Loaded\n" +
|
||||
" ℹ️ Note: This appears as stderr but is informational only\n\n" +
|
||||
output +
|
||||
"\n\n📺 Watch live in browser http://localhost:37777/ (New! v5.1)\n"
|
||||
`\n\n📺 Watch live in browser http://localhost:${port}/ (New! v5.1)\n`
|
||||
);
|
||||
|
||||
} catch (error) {
|
||||
|
||||
+66
-120
@@ -16,12 +16,18 @@ import { ensureAllDataDirs } from '../shared/paths.js';
|
||||
import { execSync } from 'child_process';
|
||||
import { readFileSync, writeFileSync, existsSync, statSync } from 'fs';
|
||||
import { join, dirname } from 'path';
|
||||
import { fileURLToPath } from 'url';
|
||||
import { homedir } from 'os';
|
||||
import { fileURLToPath } from 'url';
|
||||
import { getWorkerPort } from '../shared/worker-utils.js';
|
||||
|
||||
// Read version from package.json (works in both ESM and CJS after bundling)
|
||||
const packageJson = JSON.parse(readFileSync(join(__dirname, '..', '..', 'package.json'), 'utf-8'));
|
||||
const VERSION = packageJson.version;
|
||||
|
||||
const MODEL = process.env.CLAUDE_MEM_MODEL || 'claude-sonnet-4-5';
|
||||
const DISALLOWED_TOOLS = ['Glob', 'Grep', 'ListMcpResourcesTool', 'WebSearch'];
|
||||
const FIXED_PORT = parseInt(process.env.CLAUDE_MEM_WORKER_PORT || '37777', 10);
|
||||
const MESSAGE_POLL_INTERVAL_MS = 100;
|
||||
const MAX_REQUEST_SIZE = '50mb';
|
||||
|
||||
/**
|
||||
* Cached Claude executable path
|
||||
@@ -97,7 +103,6 @@ interface ActiveSession {
|
||||
|
||||
class WorkerService {
|
||||
private app: express.Application;
|
||||
private port: number = FIXED_PORT;
|
||||
private sessions: Map<number, ActiveSession> = new Map();
|
||||
private chromaSync!: ChromaSync;
|
||||
private sseClients: Set<Response> = new Set();
|
||||
@@ -106,7 +111,7 @@ class WorkerService {
|
||||
|
||||
constructor() {
|
||||
this.app = express();
|
||||
this.app.use(express.json({ limit: '50mb' }));
|
||||
this.app.use(express.json({ limit: MAX_REQUEST_SIZE }));
|
||||
|
||||
// Serve static files for web UI (viewer-bundle.js, logos, etc.)
|
||||
const uiDir = this.getUIDirectory();
|
||||
@@ -140,12 +145,13 @@ class WorkerService {
|
||||
|
||||
async start(): Promise<void> {
|
||||
// Start HTTP server FIRST - nothing else matters until we can respond
|
||||
const port = getWorkerPort();
|
||||
await new Promise<void>((resolve, reject) => {
|
||||
this.app.listen(FIXED_PORT, () => resolve())
|
||||
this.app.listen(port, () => resolve())
|
||||
.on('error', reject);
|
||||
});
|
||||
|
||||
logger.info('SYSTEM', 'Worker started', { port: FIXED_PORT, pid: process.pid });
|
||||
logger.info('SYSTEM', 'Worker started', { port, pid: process.pid });
|
||||
|
||||
// Initialize ChromaSync after HTTP is ready
|
||||
this.chromaSync = new ChromaSync('claude-mem');
|
||||
@@ -188,6 +194,48 @@ class WorkerService {
|
||||
return join(scriptDir, '..', 'ui');
|
||||
}
|
||||
|
||||
/**
|
||||
* Get or create session state
|
||||
* Consolidates session lookup/creation logic used by init, observation, and summarize handlers
|
||||
*/
|
||||
private getOrCreateSession(sessionDbId: number): ActiveSession {
|
||||
let session = this.sessions.get(sessionDbId);
|
||||
if (session) return session;
|
||||
|
||||
const db = new SessionStore();
|
||||
const dbSession = db.getSessionById(sessionDbId);
|
||||
if (!dbSession) {
|
||||
db.close();
|
||||
throw new Error(`Session ${sessionDbId} not found in database`);
|
||||
}
|
||||
|
||||
session = {
|
||||
sessionDbId,
|
||||
claudeSessionId: dbSession.claude_session_id,
|
||||
sdkSessionId: null,
|
||||
project: dbSession.project,
|
||||
userPrompt: dbSession.user_prompt,
|
||||
pendingMessages: [],
|
||||
abortController: new AbortController(),
|
||||
generatorPromise: null,
|
||||
lastPromptNumber: 0,
|
||||
startTime: Date.now()
|
||||
};
|
||||
|
||||
this.sessions.set(sessionDbId, session);
|
||||
|
||||
session.generatorPromise = this.runSDKAgent(session).catch(err => {
|
||||
logger.failure('WORKER', 'SDK agent error', { sessionId: sessionDbId }, err);
|
||||
const db = new SessionStore();
|
||||
db.markSessionFailed(sessionDbId);
|
||||
db.close();
|
||||
this.sessions.delete(sessionDbId);
|
||||
});
|
||||
|
||||
db.close();
|
||||
return session;
|
||||
}
|
||||
|
||||
/**
|
||||
* GET /health
|
||||
*/
|
||||
@@ -340,15 +388,14 @@ class WorkerService {
|
||||
|
||||
// Get worker stats
|
||||
const uptime = process.uptime();
|
||||
const version = process.env.npm_package_version || '5.0.3'; // fallback to current version
|
||||
|
||||
res.json({
|
||||
worker: {
|
||||
version,
|
||||
version: VERSION,
|
||||
uptime: Math.floor(uptime),
|
||||
activeSessions: this.sessions.size,
|
||||
sseClients: this.sseClients.size,
|
||||
port: this.port
|
||||
port: getWorkerPort()
|
||||
},
|
||||
database: {
|
||||
path: dbPath,
|
||||
@@ -403,13 +450,7 @@ class WorkerService {
|
||||
try {
|
||||
const { CLAUDE_MEM_MODEL, CLAUDE_MEM_CONTEXT_OBSERVATIONS, CLAUDE_MEM_WORKER_PORT } = req.body;
|
||||
|
||||
// Validate inputs
|
||||
const validModels = ['claude-haiku-4-5', 'claude-sonnet-4-5', 'claude-opus-4'];
|
||||
if (CLAUDE_MEM_MODEL && !validModels.includes(CLAUDE_MEM_MODEL)) {
|
||||
res.status(400).json({ success: false, error: `Invalid model name: ${CLAUDE_MEM_MODEL}` });
|
||||
return;
|
||||
}
|
||||
|
||||
// Validate inputs (SDK will handle model validation)
|
||||
if (CLAUDE_MEM_CONTEXT_OBSERVATIONS) {
|
||||
const obsCount = parseInt(CLAUDE_MEM_CONTEXT_OBSERVATIONS, 10);
|
||||
if (isNaN(obsCount) || obsCount < 1 || obsCount > 200) {
|
||||
@@ -648,35 +689,12 @@ class WorkerService {
|
||||
|
||||
logger.info('WORKER', 'Session init', { sessionDbId, project });
|
||||
|
||||
// Fetch real Claude Code session ID from database
|
||||
const db = new SessionStore();
|
||||
const dbSession = db.getSessionById(sessionDbId);
|
||||
if (!dbSession) {
|
||||
db.close();
|
||||
res.status(404).json({ error: 'Session not found in database' });
|
||||
return;
|
||||
}
|
||||
|
||||
const claudeSessionId = dbSession.claude_session_id;
|
||||
|
||||
// Create session state
|
||||
const session: ActiveSession = {
|
||||
sessionDbId,
|
||||
claudeSessionId,
|
||||
sdkSessionId: null,
|
||||
project,
|
||||
userPrompt,
|
||||
pendingMessages: [],
|
||||
abortController: new AbortController(),
|
||||
generatorPromise: null,
|
||||
lastPromptNumber: 0,
|
||||
startTime: Date.now()
|
||||
};
|
||||
|
||||
this.sessions.set(sessionDbId, session);
|
||||
const session = this.getOrCreateSession(sessionDbId);
|
||||
const claudeSessionId = session.claudeSessionId;
|
||||
|
||||
// Update port in database
|
||||
db.setWorkerPort(sessionDbId, this.port!);
|
||||
const db = new SessionStore();
|
||||
db.setWorkerPort(sessionDbId, getWorkerPort());
|
||||
|
||||
// Get the latest user_prompt for this session to sync to Chroma
|
||||
const latestPrompt = db.db.prepare(`
|
||||
@@ -723,23 +741,14 @@ class WorkerService {
|
||||
});
|
||||
}
|
||||
|
||||
// Start SDK agent in background
|
||||
session.generatorPromise = this.runSDKAgent(session).catch(err => {
|
||||
logger.failure('WORKER', 'SDK agent error', { sessionId: sessionDbId }, err);
|
||||
const db = new SessionStore();
|
||||
db.markSessionFailed(sessionDbId);
|
||||
db.close();
|
||||
this.sessions.delete(sessionDbId);
|
||||
});
|
||||
|
||||
// Start processing indicator (user submitted prompt)
|
||||
this.broadcastProcessingStatus(true);
|
||||
|
||||
logger.success('WORKER', 'Session initialized', { sessionId: sessionDbId, port: this.port });
|
||||
logger.success('WORKER', 'Session initialized', { sessionId: sessionDbId, port: getWorkerPort() });
|
||||
res.json({
|
||||
status: 'initialized',
|
||||
sessionDbId,
|
||||
port: this.port
|
||||
port: getWorkerPort()
|
||||
});
|
||||
}
|
||||
|
||||
@@ -751,39 +760,7 @@ class WorkerService {
|
||||
const sessionDbId = parseInt(req.params.sessionDbId, 10);
|
||||
const { tool_name, tool_input, tool_output, prompt_number } = req.body;
|
||||
|
||||
let session = this.sessions.get(sessionDbId);
|
||||
if (!session) {
|
||||
// Auto-create session if not in memory (worker restart, etc.)
|
||||
// Sessions are organizational metadata - observations are first-class data in vector store
|
||||
// Session ID comes from Claude Code hooks (guaranteed valid)
|
||||
const db = new SessionStore();
|
||||
const dbSession = db.getSessionById(sessionDbId);
|
||||
db.close();
|
||||
|
||||
session = {
|
||||
sessionDbId,
|
||||
claudeSessionId: dbSession!.claude_session_id,
|
||||
sdkSessionId: null,
|
||||
project: dbSession!.project,
|
||||
userPrompt: dbSession!.user_prompt,
|
||||
pendingMessages: [],
|
||||
abortController: new AbortController(),
|
||||
generatorPromise: null,
|
||||
lastPromptNumber: 0,
|
||||
startTime: Date.now()
|
||||
};
|
||||
this.sessions.set(sessionDbId, session);
|
||||
|
||||
// Start SDK agent in background
|
||||
session.generatorPromise = this.runSDKAgent(session).catch(err => {
|
||||
logger.failure('WORKER', 'SDK agent error', { sessionId: sessionDbId }, err);
|
||||
const db = new SessionStore();
|
||||
db.markSessionFailed(sessionDbId);
|
||||
db.close();
|
||||
this.sessions.delete(sessionDbId);
|
||||
});
|
||||
}
|
||||
|
||||
const session = this.getOrCreateSession(sessionDbId);
|
||||
const toolStr = logger.formatTool(tool_name, tool_input);
|
||||
|
||||
logger.dataIn('WORKER', `Observation queued: ${toolStr}`, {
|
||||
@@ -810,38 +787,7 @@ class WorkerService {
|
||||
const sessionDbId = parseInt(req.params.sessionDbId, 10);
|
||||
const { prompt_number } = req.body;
|
||||
|
||||
let session = this.sessions.get(sessionDbId);
|
||||
if (!session) {
|
||||
// Auto-create session if not in memory (worker restart, etc.)
|
||||
// Sessions are organizational metadata - observations are first-class data in vector store
|
||||
// Session ID comes from Claude Code hooks (guaranteed valid)
|
||||
const db = new SessionStore();
|
||||
const dbSession = db.getSessionById(sessionDbId);
|
||||
db.close();
|
||||
|
||||
session = {
|
||||
sessionDbId,
|
||||
claudeSessionId: dbSession!.claude_session_id,
|
||||
sdkSessionId: null,
|
||||
project: dbSession!.project,
|
||||
userPrompt: dbSession!.user_prompt,
|
||||
pendingMessages: [],
|
||||
abortController: new AbortController(),
|
||||
generatorPromise: null,
|
||||
lastPromptNumber: 0,
|
||||
startTime: Date.now()
|
||||
};
|
||||
this.sessions.set(sessionDbId, session);
|
||||
|
||||
// Start SDK agent in background
|
||||
session.generatorPromise = this.runSDKAgent(session).catch(err => {
|
||||
logger.failure('WORKER', 'SDK agent error', { sessionId: sessionDbId }, err);
|
||||
const db = new SessionStore();
|
||||
db.markSessionFailed(sessionDbId);
|
||||
db.close();
|
||||
this.sessions.delete(sessionDbId);
|
||||
});
|
||||
}
|
||||
const session = this.getOrCreateSession(sessionDbId);
|
||||
|
||||
logger.dataIn('WORKER', 'Summary requested', {
|
||||
sessionId: sessionDbId,
|
||||
@@ -994,7 +940,7 @@ class WorkerService {
|
||||
}
|
||||
|
||||
if (session.pendingMessages.length === 0) {
|
||||
await new Promise(resolve => setTimeout(resolve, 100));
|
||||
await new Promise(resolve => setTimeout(resolve, MESSAGE_POLL_INTERVAL_MS));
|
||||
continue;
|
||||
}
|
||||
|
||||
|
||||
+40
-68
@@ -1,16 +1,40 @@
|
||||
import path from "path";
|
||||
import { spawn } from "child_process";
|
||||
import { homedir } from "os";
|
||||
import { existsSync, readFileSync } from "fs";
|
||||
import { execSync } from "child_process";
|
||||
import { getPackageRoot } from "./paths.js";
|
||||
|
||||
const FIXED_PORT = parseInt(process.env.CLAUDE_MEM_WORKER_PORT || "37777", 10);
|
||||
// Named constants for health checks
|
||||
const HEALTH_CHECK_TIMEOUT_MS = 100;
|
||||
const HEALTH_CHECK_POLL_INTERVAL_MS = 100;
|
||||
const HEALTH_CHECK_MAX_WAIT_MS = 10000;
|
||||
|
||||
/**
|
||||
* Get the worker port number
|
||||
* Priority: ~/.claude-mem/settings.json > env var > default
|
||||
*/
|
||||
export function getWorkerPort(): number {
|
||||
try {
|
||||
const settingsPath = path.join(homedir(), '.claude-mem', 'settings.json');
|
||||
if (existsSync(settingsPath)) {
|
||||
const settings = JSON.parse(readFileSync(settingsPath, 'utf-8'));
|
||||
const port = parseInt(settings.env?.CLAUDE_MEM_WORKER_PORT, 10);
|
||||
if (!isNaN(port)) return port;
|
||||
}
|
||||
} catch {
|
||||
// Fall through to env var or default
|
||||
}
|
||||
return parseInt(process.env.CLAUDE_MEM_WORKER_PORT || '37777', 10);
|
||||
}
|
||||
|
||||
/**
|
||||
* Check if worker is responsive by trying the health endpoint
|
||||
*/
|
||||
async function isWorkerHealthy(timeoutMs: number = 100): Promise<boolean> {
|
||||
async function isWorkerHealthy(): Promise<boolean> {
|
||||
try {
|
||||
const response = await fetch(`http://127.0.0.1:${FIXED_PORT}/health`, {
|
||||
signal: AbortSignal.timeout(timeoutMs)
|
||||
const port = getWorkerPort();
|
||||
const response = await fetch(`http://127.0.0.1:${port}/health`, {
|
||||
signal: AbortSignal.timeout(HEALTH_CHECK_TIMEOUT_MS)
|
||||
});
|
||||
return response.ok;
|
||||
} catch {
|
||||
@@ -21,89 +45,37 @@ async function isWorkerHealthy(timeoutMs: number = 100): Promise<boolean> {
|
||||
/**
|
||||
* Wait for worker to become healthy
|
||||
*/
|
||||
async function waitForWorkerHealth(maxWaitMs: number = 10000): Promise<boolean> {
|
||||
async function waitForWorkerHealth(): Promise<boolean> {
|
||||
const start = Date.now();
|
||||
const checkInterval = 100; // Check every 100ms
|
||||
|
||||
while (Date.now() - start < maxWaitMs) {
|
||||
if (await isWorkerHealthy(1000)) {
|
||||
|
||||
while (Date.now() - start < HEALTH_CHECK_MAX_WAIT_MS) {
|
||||
if (await isWorkerHealthy()) {
|
||||
return true;
|
||||
}
|
||||
// Wait before next check
|
||||
await new Promise(resolve => setTimeout(resolve, checkInterval));
|
||||
await new Promise(resolve => setTimeout(resolve, HEALTH_CHECK_POLL_INTERVAL_MS));
|
||||
}
|
||||
return false;
|
||||
}
|
||||
|
||||
/**
|
||||
* Ensure worker service is running
|
||||
* Checks if worker is already running before attempting to start
|
||||
* This prevents unnecessary restarts that could interrupt mid-action processing
|
||||
* If unhealthy, restarts PM2 and waits for health
|
||||
*/
|
||||
export async function ensureWorkerRunning(): Promise<void> {
|
||||
// First, check if worker is already healthy
|
||||
if (await isWorkerHealthy()) {
|
||||
return; // Worker is already running and responsive
|
||||
return;
|
||||
}
|
||||
|
||||
const packageRoot = getPackageRoot();
|
||||
const pm2Path = path.join(packageRoot, "node_modules", ".bin", "pm2");
|
||||
const ecosystemPath = path.join(packageRoot, "ecosystem.config.cjs");
|
||||
|
||||
// Check PM2 status to see if worker process exists
|
||||
const checkProcess = spawn(pm2Path, ["list", "--no-color"], {
|
||||
execSync(`"${pm2Path}" restart "${ecosystemPath}"`, {
|
||||
cwd: packageRoot,
|
||||
stdio: ["ignore", "pipe", "ignore"],
|
||||
stdio: 'pipe'
|
||||
});
|
||||
|
||||
let output = "";
|
||||
checkProcess.stdout?.on("data", (data) => {
|
||||
output += data.toString();
|
||||
});
|
||||
|
||||
// Wait for PM2 list to complete
|
||||
await new Promise<void>((resolve, reject) => {
|
||||
checkProcess.on("error", (error) => reject(error));
|
||||
checkProcess.on("close", (code) => {
|
||||
// PM2 list can fail, but we should still continue - just assume worker isn't running
|
||||
// This handles cases where PM2 isn't installed yet
|
||||
resolve();
|
||||
});
|
||||
});
|
||||
|
||||
// Check if 'claude-mem-worker' is in the PM2 list output and is 'online'
|
||||
const isRunning = output.includes("claude-mem-worker") && output.includes("online");
|
||||
|
||||
if (!isRunning) {
|
||||
// Start the worker
|
||||
const startProcess = spawn(pm2Path, ["start", ecosystemPath], {
|
||||
cwd: packageRoot,
|
||||
stdio: "ignore",
|
||||
});
|
||||
|
||||
// Wait for PM2 start command to complete
|
||||
await new Promise<void>((resolve, reject) => {
|
||||
startProcess.on("error", (error) => reject(error));
|
||||
startProcess.on("close", (code) => {
|
||||
if (code !== 0 && code !== null) {
|
||||
reject(new Error(`PM2 start command failed with exit code ${code}`));
|
||||
} else {
|
||||
resolve();
|
||||
}
|
||||
});
|
||||
});
|
||||
}
|
||||
|
||||
// Wait for worker to become healthy (either just started or was starting)
|
||||
const healthy = await waitForWorkerHealth(10000);
|
||||
if (!healthy) {
|
||||
throw new Error("Worker failed to become healthy after starting");
|
||||
if (!await waitForWorkerHealth()) {
|
||||
throw new Error("Worker failed to become healthy after restart");
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Get the worker port number (fixed port)
|
||||
*/
|
||||
export function getWorkerPort(): number {
|
||||
return FIXED_PORT;
|
||||
}
|
||||
|
||||
Reference in New Issue
Block a user