Merge pull request #476 from thedotmack/bugfix/win-revert-cc-path-to-executable
feat(queue): Simplify queue processing and enhance reliability
@@ -0,0 +1,106 @@

# Queue System Simplification Plan

## 1. Executive Summary

The current queue system suffers from accidental complexity due to **state duplication** (in-memory vs. database), **fragile control flow** (recursive restarts), and **distributed state management**. This plan proposes a refactoring to establish the Database as the Single Source of Truth, unifying the processing logic into a robust, linear "Pump" model.

## 2. Identified Pain Points

1. **Dual State Synchronization:**
   * *Issue:* The system maintains both `session.pendingMessages` (in-memory array) and the `pending_messages` SQLite table.
   * *Impact:* Requires constant manual synchronization (push/shift/enqueue), leading to race conditions where the in-memory queue drifts from the DB state.

2. **Fragile Generator Lifecycle:**
   * *Issue:* The use of `startGeneratorWithProvider` and `startSessionWithAutoRestart` with recursive `setTimeout` calls to keep the processor alive is brittle.
   * *Impact:* Hard to debug, prone to stack issues or silent failures if the "chain" breaks.

3. **Non-Atomic State Transitions:**
   * *Issue:* The logic separates "peeking" a message from "marking it processing" (the "Critical Flow" identified in the analysis).
   * *Impact:* If the worker crashes or halts between these steps, messages can be processed twice or lost in limbo.

4. **Distributed Logic:**
   * *Issue:* Queue logic is scattered across `SessionManager` (coordination), `PendingMessageStore` (DB queries), `SDKAgent` (consumption), and `WorkerService` (orchestration).
   * *Impact:* Difficult to trace the lifecycle of a single message.

## 3. Proposed Architecture

### 3.1. Core Principle: "The Database is the Queue"

We will eliminate the in-memory `pendingMessages` array entirely. The SQLite database will be the *only* place where queue state exists.

### 3.2. Architecture Components

#### A. Atomic `claimNextMessage()`

Instead of `peek` then `mark`, we will implement a single atomic operation in `PendingMessageStore`.

* **Logic:**
  1. Find the oldest `pending` message for the session.
  2. Update it to `processing` and set the timestamp.
  3. Return the message record.
* **SQL Strategy:** Use a transaction or `UPDATE ... RETURNING` (if supported) to ensure no other worker can claim the same message.

#### B. The `QueuePump` (Unified Processor)

We will replace the recursive generator logic with a class (or function) dedicated to "pumping" messages for a specific session.

* **Pseudocode Structure:**

```typescript
async function runSessionPump(sessionId: number, signal: AbortSignal) {
  while (!signal.aborted) {
    // 1. Atomic Claim
    const message = store.claimNextMessage(sessionId);

    if (!message) {
      // 2. Wait for signal (event-driven, not polling)
      await waitForNewData(sessionId, signal);
      continue;
    }

    try {
      // 3. Process
      await sdkAgent.processMessage(message);

      // 4. Mark Complete
      store.markProcessed(message.id);
    } catch (error) {
      // 5. Handle Failure
      store.markFailed(message.id, error);
    }
  }
}
```
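
The `waitForNewData` helper is left unspecified above. One possible shape, assuming a per-session `EventEmitter` that `enqueue()` fires (names here are illustrative, not the actual codebase), is `events.once` with an `AbortSignal`, so the pump wakes on new work and unwinds cleanly on abort:

```typescript
import { EventEmitter, once } from 'node:events';

// Hypothetical shared emitter; enqueue() would emit `message:<sessionId>`.
const queueEvents = new EventEmitter();

// Resolves on the next message event for this session; rejects with an
// AbortError if the signal aborts first, letting the pump loop exit.
function waitForNewData(sessionId: number, signal: AbortSignal): Promise<unknown> {
  return once(queueEvents, `message:${sessionId}`, { signal });
}
```

Because `once` registers the listener before returning, an `emit` fired any time after the call wakes the waiter with no polling interval.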

### 3.3. Key Changes

| Component | Current State | Proposed State |
| :--- | :--- | :--- |
| **Storage** | In-memory array + SQLite | SQLite only |
| **Consumption** | `yield` loop inside SDK Agent | `QueuePump` calls SDK Agent per message |
| **Concurrency** | `peekPending` -> `markProcessing` (race-prone) | `claimNextMessage` (atomic transaction) |
| **Lifecycle** | Recursive `setTimeout` loops | Single `while` loop with `await` |
| **Recovery** | `resetStuckMessages` (global) | Pump handles its own retries + global cleanup on startup |

## 4. Implementation Steps

### Phase 1: Database Layer Hardening

1. Add `claimNextMessage(sessionDbId)` to `PendingMessageStore`.
   * Must be transactional.
   * Returns `null` if no work is available.
2. Ensure `markProcessed` and `markFailed` are robust.

### Phase 2: The Pump

1. Create `SessionQueueProcessor.ts`.
2. Implement the `while (!aborted)` loop.
3. Integrate the `EventEmitter` to wake the loop when `enqueue()` happens (replacing the current polling-like behavior).

### Phase 3: SDK Integration

1. Refactor `SDKAgent` to accept a *single* message or a streamlined iterator that doesn't manage queue state itself.
2. Remove `session.pendingMessages` from the `ActiveSession` type.

### Phase 4: Cleanup

1. Remove `startGeneratorWithProvider` and `startSessionWithAutoRestart`.
2. Remove `peekPending` (replaced by `claimNextMessage`).
3. Remove manual synchronization code in `SessionManager`.

## 5. Benefits

* **Simplicity:** An estimated 30-40% code reduction.
* **Reliability:** Atomic database operations eliminate race conditions.
* **Observability:** Linear control flow is easier to log and debug.
* **Resilience:** Crashes are handled by simply restarting the Pump, which naturally picks up "processing" (stuck) or "pending" messages.
@@ -0,0 +1,46 @@

# Queue System Simplification Implementation

I have successfully implemented the queue system simplification plan.

## Changes Implemented

### 1. Database Layer Hardening

- **Added `claimNextMessage(sessionDbId)` to `PendingMessageStore`:**
  - Implements an atomic transaction (SELECT oldest pending + UPDATE to processing).
  - Ensures a message can only be claimed by one worker at a time.
  - Eliminates race conditions between "peeking" and "marking".
- **Removed `peekPending()`:**
  - No longer needed, as `claimNextMessage` handles retrieval and locking in one step.

### 2. Unified "Pump" Architecture

- **Created `src/services/queue/SessionQueueProcessor.ts`:**
  - Implements a robust `AsyncIterableIterator` that yields messages.
  - Encapsulates the "Claim -> Yield -> Wait" loop.
  - Replaces fragile polling/recursive logic with event-driven `waitForMessage`.
  - Handles empty queues gracefully by waiting for signals.
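
The "Claim -> Yield -> Wait" loop can be sketched as a small generic generator. This is a sketch with hypothetical `claim`/`waitForMessage` callbacks, not the actual `SessionQueueProcessor` code:

```typescript
interface QueuedMessage { id: number }

// Claim -> Yield -> Wait: drain all claimable work, then block until
// either new work is signalled or the session is aborted.
async function* pumpMessages(
  claim: () => QueuedMessage | null,
  waitForMessage: () => Promise<void>,
  signal: AbortSignal,
): AsyncIterableIterator<QueuedMessage> {
  while (!signal.aborted) {
    const message = claim();
    if (message) {
      yield message;        // consumer processes it, then we claim again
      continue;
    }
    await waitForMessage(); // empty queue: sleep until enqueue() signals
  }
}
```

The consumer drives the generator with `for await`, so backpressure is automatic: no new message is claimed until the previous one has been handled.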

### 3. SessionManager Refactoring

- **Updated `getMessageIterator`:**
  - Now delegates to `SessionQueueProcessor`.
  - Removes complex manual synchronization logic.
- **Removed In-Memory Queue State:**
  - `queueObservation` and `queueSummarize` now only write to the DB and emit events.
  - The `pendingMessages` array is no longer used for logic (retained, deprecated, for type compatibility).
  - `getTotalActiveWork`, `hasPendingMessages`, etc., now query `PendingMessageStore` directly (counting both 'pending' and 'processing' states).

### 4. Logic Cleanup

- **Removed Recursive Restarts:**
  - Refactored `startGeneratorWithProvider` in `SessionRoutes.ts` and `startSessionProcessor` in `WorkerService.ts`.
  - Removed logic that deleted sessions when the queue emptied (sessions now wait for new work).
  - Removed "auto-restart" logic for normal completion (kept only for crash recovery).

## Benefits

- **Reliability:** Atomic DB operations prevent stuck or duplicate messages.
- **Simplicity:** Removed complex "peek-then-mark" and recursive restart chains.
- **Performance:** Zero-latency event notification with efficient DB queries.
- **Maintainability:** Clear separation of concerns (Store vs. Processor vs. Manager).

## Verification

- Ran static analysis (`tsc`) to verify type safety of the new components.
- Verified removal of dead code (`peekPending`).
- Confirmed integration points in `SessionManager` and `SessionRoutes`.
@@ -0,0 +1,742 @@

# Queue System Logic Report

This document provides a line-by-line analysis of the queue system in claude-mem, explaining **the reason behind each piece of logic** and **what it actually does**.

---

## Table of Contents

1. [High-Level Architecture](#high-level-architecture)
2. [Message Status State Machine](#message-status-state-machine)
3. [PendingMessageStore (Database Layer)](#pendingmessagestore-database-layer)
4. [SessionManager (Queue Coordination)](#sessionmanager-queue-coordination)
5. [SDKAgent (Message Consumer)](#sdkagent-message-consumer)
6. [SessionRoutes (HTTP Entry Points)](#sessionroutes-http-entry-points)
7. [WorkerService (Orchestrator)](#workerservice-orchestrator)
8. [Critical Flow: How a Message Gets Stuck in "Processing"](#critical-flow-how-a-message-gets-stuck-in-processing)
9. [Recovery Mechanisms](#recovery-mechanisms)

---

## High-Level Architecture
```
Hook (post-tool-use/summary)
   │
   ▼
SessionRoutes.handleObservations/handleSummarize
   │
   ▼
SessionManager.queueObservation/queueSummarize
   │
   ├─► PendingMessageStore.enqueue()        [DB: status='pending']
   │
   ├─► session.pendingMessages.push()       [In-memory queue]
   │
   └─► emitter.emit('message')              [Wake up generator]

   │
   ▼
SDKAgent.createMessageGenerator (async generator)
   │
   ├─► SessionManager.getMessageIterator()
   │      │
   │      ├─► PendingMessageStore.peekPending()     [Find oldest pending]
   │      │
   │      ├─► PendingMessageStore.markProcessing()  [DB: status='processing']
   │      │
   │      └─► yield message to SDK
   │
   ▼
SDK query() processes message and returns response
   │
   ▼
SDKAgent.processSDKResponse()
   │
   └─► SDKAgent.markMessagesProcessed()
          │
          └─► PendingMessageStore.markProcessed()   [DB: status='processed']
```

---

## Message Status State Machine

```
        ┌─────────────┐
        │    (new)    │
        └──────┬──────┘
               │ enqueue()
               ▼
        ┌─────────────┐
   ┌────│   pending   │◄───────────────┐
   │    └──────┬──────┘                │
   │           │ markProcessing()      │ markFailed() [retry_count < maxRetries]
   │           ▼                       │
   │    ┌─────────────┐                │
   │    │ processing  │────────────────┤
   │    └──────┬──────┘                │
   │           │                       │
   │           ├─► markProcessed()     │
   │           │        │              │
   │           │        ▼              │
   │           │  ┌─────────────┐      │
   │           │  │  processed  │      │
   │           │  └─────────────┘      │
   │           │                       │
   │           └─► markFailed() [retry_count >= maxRetries]
   │                    │
   │                    ▼
   │             ┌─────────────┐
   │             │   failed    │
   │             └─────────────┘
   │
   │ resetStuckMessages() [thresholdMs timeout]
   └───────────────────────────────────┘
```
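
The legal edges of the diagram can be captured in a small lookup table. This is a hypothetical helper for illustration only; the real store encodes these guards in its SQL `WHERE` clauses:

```typescript
type MessageStatus = 'pending' | 'processing' | 'processed' | 'failed';

// Edges from the state machine above. 'processing' can return to
// 'pending' via a markFailed() retry or via resetStuckMessages().
const TRANSITIONS: Record<MessageStatus, MessageStatus[]> = {
  pending:    ['processing'],
  processing: ['processed', 'pending', 'failed'],
  processed:  [],   // terminal
  failed:     [],   // terminal
};

function canTransition(from: MessageStatus, to: MessageStatus): boolean {
  return TRANSITIONS[from].includes(to);
}
```

Note that `pending` and `processing` are the only non-terminal states, which is why "work remaining" queries count both.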

---

## PendingMessageStore (Database Layer)

### `enqueue()` (Lines 56-82)

```typescript
enqueue(sessionDbId: number, claudeSessionId: string, message: PendingMessage): number {
  const now = Date.now();
  const stmt = this.db.prepare(`
    INSERT INTO pending_messages (
      session_db_id, claude_session_id, message_type,
      tool_name, tool_input, tool_response, cwd,
      last_user_message, last_assistant_message,
      prompt_number, status, retry_count, created_at_epoch
    ) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, 'pending', 0, ?)
  `);
```

| Line | The Reason Behind This | What It Actually Does |
|------|------------------------|----------------------|
| `const now = Date.now()` | Messages need timestamps for ordering and stuck-detection | Captures the moment the message was queued |
| `status, retry_count ... 'pending', 0` | New messages start in pending state with no retries | Hard-codes initial state in SQL |
| `created_at_epoch` | Need to track when the message was originally queued for accurate observation timestamps | Used later when processing backlog to assign correct timestamps to observations |
| `JSON.stringify(message.tool_input)` | SQLite can't store objects natively | Serializes complex tool data to string |
| Returns `lastInsertRowid` | Caller needs the ID to track this specific message | Returns the database-assigned auto-increment ID |
### `peekPending()` (Lines 88-96)

```typescript
peekPending(sessionDbId: number): PersistentPendingMessage | null {
  const stmt = this.db.prepare(`
    SELECT * FROM pending_messages
    WHERE session_db_id = ? AND status = 'pending'
    ORDER BY id ASC
    LIMIT 1
  `);
  return stmt.get(sessionDbId) as PersistentPendingMessage | null;
}
```

| Line | The Reason Behind This | What It Actually Does |
|------|------------------------|----------------------|
| `status = 'pending'` | Only look at messages not yet being processed | Filters out processing/processed/failed |
| `ORDER BY id ASC` | Process messages in the order they arrived (FIFO) | Uses auto-increment ID as natural ordering |
| `LIMIT 1` | Only need one message at a time for the iterator | Returns the single oldest pending message |
| Does NOT change status | Peek is non-destructive; the status change happens separately in markProcessing | Allows checking without committing to process |
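
Because the peek is non-destructive, two consumers that interleave between `peekPending()` and `markProcessing()` can both receive the same row. A toy in-memory simulation of that window (illustrative only, not the real store):

```typescript
interface Row { id: number; status: 'pending' | 'processing' }

const rows: Row[] = [{ id: 1, status: 'pending' }];

const peekPending = (): Row | null =>
  rows.find(r => r.status === 'pending') ?? null;

const markProcessing = (id: number): void => {
  const row = rows.find(r => r.id === id && r.status === 'pending');
  if (row) row.status = 'processing';
};

// Workers A and B both peek before either marks:
const seenByA = peekPending();
const seenByB = peekPending();
markProcessing(seenByA!.id);
markProcessing(seenByB!.id); // guard makes this a no-op...
// ...but B already holds row 1, so the message is processed twice.
```

This is exactly the window that an atomic `claimNextMessage` closes: the find and the status flip happen in one statement.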

### `markProcessing()` (Lines 216-224)

```typescript
markProcessing(messageId: number): void {
  const now = Date.now();
  const stmt = this.db.prepare(`
    UPDATE pending_messages
    SET status = 'processing', started_processing_at_epoch = ?
    WHERE id = ? AND status = 'pending'
  `);
  stmt.run(now, messageId);
}
```

| Line | The Reason Behind This | What It Actually Does |
|------|------------------------|----------------------|
| `status = 'processing'` | Mark this message as "in progress" so other consumers don't pick it up | Prevents duplicate processing |
| `started_processing_at_epoch = ?` | Track when processing started for stuck detection | If processing takes >5 min, the message is considered stuck |
| `WHERE ... AND status = 'pending'` | Only transition from pending->processing (idempotent safety) | Prevents double-processing race conditions |
### `markProcessed()` (Lines 230-242)

```typescript
markProcessed(messageId: number): void {
  const now = Date.now();
  const stmt = this.db.prepare(`
    UPDATE pending_messages
    SET
      status = 'processed',
      completed_at_epoch = ?,
      tool_input = NULL,
      tool_response = NULL
    WHERE id = ? AND status = 'processing'
  `);
  stmt.run(now, messageId);
}
```

| Line | The Reason Behind This | What It Actually Does |
|------|------------------------|----------------------|
| `status = 'processed'` | Message successfully handled, move to terminal state | Marks completion |
| `completed_at_epoch = ?` | Track when processing finished for metrics/display | Records completion time |
| `tool_input = NULL, tool_response = NULL` | Large payload data is no longer needed after successful processing | Frees space; observations are already saved elsewhere |
| `WHERE ... AND status = 'processing'` | Only transition from processing->processed | Ensures we only complete messages we actually processed |
### `markFailed()` (Lines 249-274)

```typescript
markFailed(messageId: number): void {
  const msg = this.db.prepare('SELECT retry_count FROM pending_messages WHERE id = ?').get(messageId);
  if (!msg) return;

  if (msg.retry_count < this.maxRetries) {
    // Move back to pending for retry
    const stmt = this.db.prepare(`
      UPDATE pending_messages
      SET status = 'pending', retry_count = retry_count + 1, started_processing_at_epoch = NULL
      WHERE id = ?
    `);
    stmt.run(messageId);
  } else {
    // Max retries exceeded, mark as permanently failed
    const stmt = this.db.prepare(`
      UPDATE pending_messages
      SET status = 'failed', completed_at_epoch = ?
      WHERE id = ?
    `);
    stmt.run(Date.now(), messageId);
  }
}
```

| Line | The Reason Behind This | What It Actually Does |
|------|------------------------|----------------------|
| Check `retry_count < maxRetries` | Don't retry forever; eventually give up | Implements a bounded retry policy (default: 3) |
| `status = 'pending'` (retry path) | Put the message back in the queue for another attempt | Allows automatic recovery |
| `retry_count + 1` | Track how many times we've tried | Increments toward the failure threshold |
| `started_processing_at_epoch = NULL` | Clear the processing timestamp for the next attempt | Prevents stuck detection from immediately triggering |
| `status = 'failed'` (terminal) | Message is permanently broken, stop trying | Prevents infinite retry loops |
### `resetStuckMessages()` (Lines 281-292)

```typescript
resetStuckMessages(thresholdMs: number): number {
  const cutoff = thresholdMs === 0 ? Date.now() : Date.now() - thresholdMs;

  const stmt = this.db.prepare(`
    UPDATE pending_messages
    SET status = 'pending', started_processing_at_epoch = NULL
    WHERE status = 'processing' AND started_processing_at_epoch < ?
  `);
  const result = stmt.run(cutoff);

  return result.changes;
}
```

| Line | The Reason Behind This | What It Actually Does |
|------|------------------------|----------------------|
| `thresholdMs === 0 ? Date.now()` | Special case: threshold=0 means "reset all processing messages" | Allows forced recovery of all stuck messages |
| `Date.now() - thresholdMs` | Calculate the cutoff time (e.g., 5 minutes ago) | Messages processing longer than this are stuck |
| `status = 'processing'` condition | Only reset messages actively being processed | Don't touch pending or completed messages |
| `started_processing_at_epoch < ?` | Processing started before the cutoff = stuck | Time-based stuck detection |
| `SET status = 'pending'` | Move back to the queue for retry | Enables automatic recovery |
| Returns `result.changes` | Caller needs to know how many were recovered | For logging/metrics |
### `getPendingCount()` (Lines 297-304)

```typescript
getPendingCount(sessionDbId: number): number {
  const stmt = this.db.prepare(`
    SELECT COUNT(*) as count FROM pending_messages
    WHERE session_db_id = ? AND status IN ('pending', 'processing')
  `);
  return (stmt.get(sessionDbId) as { count: number }).count;
}
```

| Line | The Reason Behind This | What It Actually Does |
|------|------------------------|----------------------|
| `status IN ('pending', 'processing')` | **CRITICAL**: Counts BOTH pending AND processing | Used to decide if the generator should keep running |
| Why include processing? | A message in the processing state is still "work to be done" | Prevents the generator from stopping while the SDK is mid-response |

---
## SessionManager (Queue Coordination)

### `queueObservation()` (Lines 181-232)

```typescript
queueObservation(sessionDbId: number, data: ObservationData): void {
  // Auto-initialize from database if needed
  let session = this.sessions.get(sessionDbId);
  if (!session) {
    session = this.initializeSession(sessionDbId);
  }

  // CRITICAL: Persist to database FIRST
  const message: PendingMessage = { type: 'observation', ... };
  const messageId = this.getPendingStore().enqueue(sessionDbId, session.claudeSessionId, message);

  // Add to in-memory queue
  session.pendingMessages.push(message);

  // Notify generator immediately
  const emitter = this.sessionQueues.get(sessionDbId);
  emitter?.emit('message');
}
```

| Line | The Reason Behind This | What It Actually Does |
|------|------------------------|----------------------|
| Auto-initialize session | Worker may have restarted; need to rebuild in-memory state | Lazy initialization from database |
| `enqueue()` BEFORE in-memory push | **CRITICAL**: Database is the source of truth, survives crashes | Persist-first ensures no data loss |
| `session.pendingMessages.push()` | In-memory queue for backward compatibility and fast status checks | Mirrors database state in RAM |
| `emitter?.emit('message')` | Wake up the generator immediately (zero latency) | Event-driven, no polling needed |
### `getMessageIterator()` (Lines 397-477)

```typescript
async *getMessageIterator(sessionDbId: number): AsyncIterableIterator<PendingMessageWithId> {
  while (!session.abortController.signal.aborted) {
    // Check for pending messages in persistent store
    const persistentMessage = this.getPendingStore().peekPending(sessionDbId);

    if (!persistentMessage) {
      // Wait for new message event
      await new Promise<void>(resolve => {
        emitter.once('message', messageHandler);
        session.abortController.signal.addEventListener('abort', abortHandler, { once: true });
      });
      continue;
    }

    // Mark as processing BEFORE yielding
    this.getPendingStore().markProcessing(persistentMessage.id);

    // Track this message ID for completion marking
    session.pendingProcessingIds.add(persistentMessage.id);

    // Convert and yield
    const message: PendingMessageWithId = {
      _persistentId: persistentMessage.id,
      _originalTimestamp: persistentMessage.created_at_epoch,
      ...this.getPendingStore().toPendingMessage(persistentMessage)
    };

    yield message;

    // Remove from in-memory queue after yielding
    session.pendingMessages.shift();
  }
}
```

| Line | The Reason Behind This | What It Actually Does |
|------|------------------------|----------------------|
| `while (!aborted)` | Keep processing until the session ends | Continuous processing loop |
| `peekPending()` | Check the database for work | Non-destructively looks for pending messages |
| `await new Promise` with event | Block until a message arrives (no polling) | Event-driven wake-up saves CPU |
| `markProcessing()` BEFORE yield | **CRITICAL**: Claim the message before giving it to the SDK | Prevents race conditions |
| `pendingProcessingIds.add()` | Track which messages are being processed | So we know what to mark as completed |
| `_persistentId` field | Attach the database ID to the in-flight message | Needed for markProcessed() later |
| `_originalTimestamp` | Preserve the original queue time | For accurate observation timestamps when processing backlog |
| `pendingMessages.shift()` after yield | Keep the in-memory queue in sync with the database | Mirrors the database state change |

---
## SDKAgent (Message Consumer)

### `startSession()` Main Loop (Lines 75-150)

```typescript
const queryResult = query({
  prompt: messageGenerator,
  options: {
    model: modelId,
    resume: session.claudeSessionId, // <-- Session continuity
    disallowedTools,
    abortController: session.abortController,
    pathToClaudeCodeExecutable: claudePath
  }
});

for await (const message of queryResult) {
  if (message.type === 'assistant') {
    // Process response
    await this.processSDKResponse(session, textContent, worker, discoveryTokens, originalTimestamp);
  }
}
```

| Line | The Reason Behind This | What It Actually Does |
|------|------------------------|----------------------|
| `resume: session.claudeSessionId` | **CRITICAL**: Connect to the existing Claude session | Enables session continuity; same transcript across prompts |
| `for await` loop | Process SDK responses as they arrive | Streaming response handling |
| `processSDKResponse()` called per response | Parse and save observations/summaries | Database + Chroma sync |
### `createMessageGenerator()` (Lines 202-291)

```typescript
private async *createMessageGenerator(session: ActiveSession): AsyncIterableIterator<SDKUserMessage> {
  // Build initial or continuation prompt
  const initPrompt = isInitPrompt
    ? buildInitPrompt(...)
    : buildContinuationPrompt(...);

  // Yield initial prompt
  yield { type: 'user', message: { role: 'user', content: initPrompt }, session_id: session.claudeSessionId };

  // Consume pending messages
  for await (const message of this.sessionManager.getMessageIterator(session.sessionDbId)) {
    if (message.type === 'observation') {
      const obsPrompt = buildObservationPrompt({ ... });
      yield { type: 'user', message: { role: 'user', content: obsPrompt } };
    } else if (message.type === 'summarize') {
      const summaryPrompt = buildSummaryPrompt({ ... });
      yield { type: 'user', message: { role: 'user', content: summaryPrompt } };
    }
  }
}
```

| Line | The Reason Behind This | What It Actually Does |
|------|------------------------|----------------------|
| `isInitPrompt` check | The first prompt needs full context; subsequent prompts need continuation | Different prompt templates |
| `yield` initial prompt | Start the SDK conversation | Sends initialization to Claude |
| `for await ... getMessageIterator` | Pull messages as they become available | Event-driven message consumption |
| `yield` for each message | Feed observations/summaries to the SDK one at a time | SDK processes each and responds |
### `markMessagesProcessed()` (Lines 462-491)

```typescript
private async markMessagesProcessed(session: ActiveSession, worker: any): Promise<void> {
  const pendingMessageStore = this.sessionManager.getPendingMessageStore();

  if (session.pendingProcessingIds.size > 0) {
    for (const messageId of session.pendingProcessingIds) {
      pendingMessageStore.markProcessed(messageId);
    }
    session.pendingProcessingIds.clear();
    session.earliestPendingTimestamp = null;

    // Cleanup old processed messages
    const deletedCount = pendingMessageStore.cleanupProcessed(100);
  }

  // Broadcast status update
  if (worker && typeof worker.broadcastProcessingStatus === 'function') {
    worker.broadcastProcessingStatus();
  }
}
```

| Line | The Reason Behind This | What It Actually Does |
|------|------------------------|----------------------|
| Loop over `pendingProcessingIds` | Mark ALL messages that were yielded to the SDK | Batch completion |
| `markProcessed()` for each | Transition processing->processed in the database | Completes the message lifecycle |
| `.clear()` | Reset the tracking set for the next batch | Prepares for the next iteration |
| `earliestPendingTimestamp = null` | Reset timestamp tracking | Next batch gets fresh timestamps |
| `cleanupProcessed(100)` | Don't keep infinite processed messages | Retention policy |
| `broadcastProcessingStatus()` | Update the UI with the new state | SSE broadcast |

---
## SessionRoutes (HTTP Entry Points)

### `startGeneratorWithProvider()` (Lines 118-189)

```typescript
private startGeneratorWithProvider(session, provider, source): void {
  session.currentProvider = provider;

  session.generatorPromise = agent.startSession(session, this.workerService)
    .catch(error => {
      // Mark all processing messages as failed
      const processingMessages = stmt.all(session.sessionDbId);
      for (const msg of processingMessages) {
        pendingStore.markFailed(msg.id);
      }
    })
    .finally(() => {
      session.generatorPromise = null;
      session.currentProvider = null;
      this.workerService.broadcastProcessingStatus();

      // Check if there's more work pending
      const pendingCount = pendingStore.getPendingCount(sessionDbId);
      if (pendingCount > 0) {
        // Auto-restart
        setTimeout(() => {
          if (stillExists && !stillExists.generatorPromise) {
            this.startGeneratorWithProvider(stillExists, this.getSelectedProvider(), 'auto-restart');
          }
        }, 0);
      } else {
        // Cleanup
        this.sessionManager.deleteSession(sessionDbId);
      }
    });
}
```

| Line | The Reason Behind This | What It Actually Does |
|------|------------------------|----------------------|
| `session.generatorPromise =` | Track that the generator is running | Prevents multiple generators per session |
| `.catch()` with markFailed | If the generator crashes, don't lose messages | Marks for retry or permanent failure |
| `.finally()` | Always clean up regardless of success/failure | Guaranteed cleanup |
| `generatorPromise = null` | Allow a new generator to start | Clears the "running" flag |
| `getPendingCount() > 0` | **CRITICAL**: Check if more work arrived while processing | Handles messages queued during the SDK call |
| `setTimeout(..., 0)` | Don't restart synchronously (could cause stack issues) | Deferred restart |
| `deleteSession()` when no work | Clean up resources | Memory management |
### `ensureGeneratorRunning()` (Lines 90-113)

```typescript
private ensureGeneratorRunning(sessionDbId: number, source: string): void {
  const session = this.sessionManager.getSession(sessionDbId);
  if (!session) return;

  const selectedProvider = this.getSelectedProvider();

  // Start generator if not running
  if (!session.generatorPromise) {
    this.startGeneratorWithProvider(session, selectedProvider, source);
    return;
  }

  // Generator is running - check if provider changed
  if (session.currentProvider && session.currentProvider !== selectedProvider) {
    // Let current generator finish, next one will use new provider
  }
}
```

| Line | The Reason Behind This | What It Actually Does |
|------|------------------------|----------------------|
| Check `!generatorPromise` | Only start if not already running | Prevents duplicate generators |
| Start generator if not running | Ensure messages get processed | Lazy generator startup |
| Provider change detection | Allow switching providers mid-session | Graceful provider transition |

---
## WorkerService (Orchestrator)
|
||||
|
||||
### `initializeBackground()` Stuck Message Recovery (Lines 627-633)
|
||||
|
||||
```typescript
// Recover stuck messages from previous crashes
const STUCK_THRESHOLD_MS = 5 * 60 * 1000; // 5 minutes
const resetCount = pendingStore.resetStuckMessages(STUCK_THRESHOLD_MS);
if (resetCount > 0) {
  logger.info('SYSTEM', `Recovered ${resetCount} stuck messages from previous session`);
}
```

| Line | The Reason Behind This | What It Actually Does |
|------|------------------------|----------------------|
| Called at startup | The worker may have crashed while messages were processing | Recovery mechanism |
| 5-minute threshold | If processing takes longer than 5 minutes, something went wrong | Reasonable timeout for SDK calls |
| Reset to pending | Give stuck messages another chance | Automatic retry |
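The body of `resetStuckMessages()` is not shown in this excerpt. A plausible in-memory sketch of the reset rule, assuming the `pending_messages` fields used in the diagnostic queries at the end of this document (the real store is SQLite):

```typescript
// Sketch of the stuck-message reset rule over an in-memory queue.
// Field names mirror the pending_messages columns; this is an assumption,
// not the store's actual implementation.
type Msg = {
  id: number;
  status: "pending" | "processing" | "processed";
  startedProcessingAtEpoch: number | null;
};

function resetStuckMessages(queue: Msg[], thresholdMs: number, now = Date.now()): number {
  let reset = 0;
  for (const m of queue) {
    const stuck = m.status === "processing"
      && m.startedProcessingAtEpoch !== null
      && now - m.startedProcessingAtEpoch > thresholdMs;
    if (stuck) {
      m.status = "pending";             // give the message another chance
      m.startedProcessingAtEpoch = null;
      reset++;
    }
  }
  return reset;                         // how many messages were recovered
}
```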

### `processPendingQueues()` (Lines 747-811)

```typescript
async processPendingQueues(sessionLimit: number = 10): Promise<Result> {
  const orphanedSessionIds = pendingStore.getSessionsWithPendingMessages();

  for (const sessionDbId of orphanedSessionIds) {
    // Skip if session already has active generator
    const existingSession = this.sessionManager.getSession(sessionDbId);
    if (existingSession?.generatorPromise) {
      result.sessionsSkipped++;
      continue;
    }

    // Initialize session and start SDK agent
    const session = this.sessionManager.initializeSession(sessionDbId);
    this.startSessionWithAutoRestart(session, getPendingCount, 'startup-recovery');
  }
}
```

| Line | The Reason Behind This | What It Actually Does |
|------|------------------------|----------------------|
| Called at startup | Resume work interrupted by crash/restart | Auto-recovery |
| `getSessionsWithPendingMessages()` | Find sessions that have orphaned work | Database query |
| Skip if generator running | Don't start duplicate processors | Race condition prevention |
| `startSessionWithAutoRestart()` | Start processing with auto-restart logic | Shares code with SessionRoutes |

### `startSessionWithAutoRestart()` (Lines 696-739)

```typescript
private startSessionWithAutoRestart(session, getPendingCount, source): void {
  session.generatorPromise = this.sdkAgent.startSession(session, this)
    .catch(error => { ... })
    .finally(() => {
      session.generatorPromise = null;
      this.broadcastProcessingStatus();

      const stillPending = getPendingCount(sid);
      if (stillPending > 0) {
        // Recursive restart
        setTimeout(() => {
          const stillExists = this.sessionManager.getSession(sid);
          if (stillExists && !stillExists.generatorPromise) {
            this.startSessionWithAutoRestart(stillExists, getPendingCount, 'auto-restart');
          }
        }, 0);
      } else {
        // Cleanup
        this.sessionManager.deleteSession(sid);
      }
    });
}
```

| Line | The Reason Behind This | What It Actually Does |
|------|------------------------|----------------------|
| Same pattern as SessionRoutes | **DRY**: Shared auto-restart logic | Prevents code duplication |
| Recursive restart | Keep processing until queue is empty | Ensures all messages processed |
| Check `stillExists` before restart | Session might have been deleted | Safety check |

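The recursive restart drains the queue one generator run at a time. The same drain-until-empty contract can be sketched as a single linear loop, which is easier to reason about than a `setTimeout` chain; names here are illustrative stand-ins, not the project's code:

```typescript
// Sketch: the drain-until-empty contract as a linear loop ("pump").
// processOne stands in for one SDK generator run; getPendingCount is
// re-checked after every run, so work queued mid-run is not dropped.
async function pump(
  getPendingCount: () => number,
  processOne: () => Promise<void>
): Promise<number> {
  let processed = 0;
  while (getPendingCount() > 0) {
    await processOne();
    processed++;
  }
  return processed; // the loop exits only when the queue is empty
}
```

Because the re-check happens inside one function frame, there is no restart "chain" to break and no deferred callback to lose.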
---

## Critical Flow: How a Message Gets Stuck in "Processing"

### The Problem

Messages can get stuck in `status = 'processing'` if:

1. **SDK call hangs indefinitely** - The Agent SDK query never returns
2. **Worker crashes mid-processing** - The process dies before `markProcessed()` runs
3. **Exception in `processSDKResponse()`** - An error prevents `markProcessed()` from running

### The Flow

```
1. queueObservation() called
   └─► enqueue() → status = 'pending'

2. getMessageIterator() picks up message
   └─► markProcessing() → status = 'processing' ✓
   └─► pendingProcessingIds.add(id)
   └─► yield message to SDK

3. SDK processes and returns response
   └─► processSDKResponse() called
       └─► Parse observations/summaries
       └─► Store to database
       └─► markMessagesProcessed()
           └─► markProcessed() → status = 'processed' ✓

IF STEP 3 FAILS OR HANGS:
   └─► Message stays in 'processing' forever
   └─► Recovery: resetStuckMessages() after 5 minutes
```

### Why Processing Messages Can Get "Lost"

**Race condition in `getMessageIterator()`:**

```typescript
// Lines 445-446 in SessionManager
this.getPendingStore().markProcessing(persistentMessage.id);
session.pendingProcessingIds.add(persistentMessage.id);
```

The message is marked as `processing` BEFORE being yielded. If the SDK hangs or crashes after this line but before `processSDKResponse()` completes, the message is stuck.

**Protection mechanisms:**

1. `pendingProcessingIds` tracks what is in flight
2. `markFailed()` in the catch handler marks messages for retry
3. `resetStuckMessages()` at startup cleans up old stuck messages

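The mitigation for this race is to make the claim itself atomic: `markProcessing()` succeeds only while the message is still `pending` (the `WHERE status = 'pending'` guard noted in the summary table). An in-memory sketch of that compare-and-set, with illustrative types:

```typescript
// Sketch of an atomic claim, mirroring
//   UPDATE pending_messages SET status = 'processing' WHERE id = ? AND status = 'pending'
// Only one caller can win the claim; a second attempt sees a non-pending
// status and backs off. Types are illustrative, not the store's real schema.
type Claimable = { id: number; status: "pending" | "processing" };

function markProcessing(queue: Claimable[], id: number): boolean {
  const msg = queue.find(m => m.id === id);
  if (!msg || msg.status !== "pending") return false; // someone else claimed it
  msg.status = "processing";
  return true; // we own the message; safe to yield it to the SDK now
}
```

The boolean return matters: the caller must only yield the message when the claim succeeded, so a message can never be processed twice.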
---

## Recovery Mechanisms

### 1. Startup Recovery (Worker crashes)

```typescript
// In initializeBackground()
const resetCount = pendingStore.resetStuckMessages(STUCK_THRESHOLD_MS);
```

- Runs when the worker starts
- Finds messages stuck in `processing` for more than 5 minutes
- Resets them to `pending` for retry

### 2. Generator Error Recovery

```typescript
// In startGeneratorWithProvider() catch handler
for (const msg of processingMessages) {
  pendingStore.markFailed(msg.id);
}
```

- Runs when the SDK call throws
- Marks processing messages as failed (which may reset them to `pending` if retries remain)

### 3. Auto-Restart Recovery

```typescript
// In startGeneratorWithProvider() finally handler
if (pendingCount > 0) {
  setTimeout(() => startGeneratorWithProvider(...), 0);
}
```

- Runs after every generator completes
- Checks for pending work
- Starts a new generator if work remains

### 4. Manual Recovery (UI)

```typescript
// PendingMessageStore methods
retryMessage(messageId)    // Reset a specific message to pending
retryAllStuck(thresholdMs) // Reset all stuck messages
abortMessage(messageId)    // Delete a message from the queue
```

---

## Summary of Potential Issues

| Issue | Cause | Mitigation |
|-------|-------|------------|
| Message stuck in processing | SDK hang, crash during processing | `resetStuckMessages()` at startup |
| Duplicate processing | Race condition on message claim | `markProcessing()` with `WHERE status = 'pending'` |
| Lost messages | Crash before enqueue | DB persist BEFORE in-memory push |
| Generator never starts | No call to `ensureGeneratorRunning()` | Called by every HTTP handler |
| Generator exits early | Empty-queue check race | `finally` handler checks and restarts |
| Infinite retry | Repeated failures | `maxRetries` limit (default: 3) |

---

## Diagnostic Queries

Check for stuck messages:

```sql
SELECT * FROM pending_messages
WHERE status = 'processing'
  AND started_processing_at_epoch < (strftime('%s', 'now') * 1000 - 300000);
```

Check queue depth by session:

```sql
SELECT session_db_id, status, COUNT(*)
FROM pending_messages
GROUP BY session_db_id, status;
```

Check retry counts:

```sql
SELECT id, message_type, retry_count, status
FROM pending_messages
WHERE retry_count > 0;
```
File diff suppressed because one or more lines are too long
fetch(`http://127.0.0.1:${o}/api/sessions/init`,{method:"POST",headers:{"Content-Type":"application/json"},body:JSON.stringify({claudeSessionId:t,project:n,prompt:e}),signal:AbortSignal.timeout(5e3)});if(!a.ok)throw new Error(`Session initialization failed: ${a.status}`);let s=await a.json(),l=s.sessionDbId,_=s.promptNumber;if(E.info("HOOK","new-hook: Received from /api/sessions/init",{sessionDbId:l,promptNumber:_,skipped:s.skipped}),s.skipped&&s.reason==="private"){E.info("HOOK",`new-hook: Session ${l}, prompt #${_} (fully private - skipped)`),console.log(S);return}E.info("HOOK",`new-hook: Session ${l}, prompt #${_}`);let c=e.startsWith("/")?e.substring(1):e;E.info("HOOK","new-hook: Calling /sessions/{sessionDbId}/init",{sessionDbId:l,promptNumber:_,userPrompt_length:c?.length});let u=await fetch(`http://127.0.0.1:${o}/sessions/${l}/init`,{method:"POST",headers:{"Content-Type":"application/json"},body:JSON.stringify({userPrompt:c,promptNumber:_}),signal:AbortSignal.timeout(5e3)});if(!u.ok)throw new Error(`SDK agent start failed: ${u.status}`);console.log(S)}var C="";y.on("data",i=>C+=i);y.on("end",async()=>{let i;try{i=C?JSON.parse(C):void 0}catch(t){throw new Error(`Failed to parse hook input: ${t instanceof Error?t.message:String(t)}`)}await Q(i)});
${s}`),s}var V=m.join(X(),".claude","plugins","marketplaces","thedotmack"),I=h(A.HEALTH_CHECK),O=null;function p(){if(O!==null)return O;let i=m.join(g.get("CLAUDE_MEM_DATA_DIR"),"settings.json"),t=g.loadFromFile(i);return O=parseInt(t.CLAUDE_MEM_WORKER_PORT,10),O}async function B(){let i=p();return(await fetch(`http://127.0.0.1:${i}/api/readiness`,{signal:AbortSignal.timeout(I)})).ok}function Y(){let i=m.join(V,"package.json");return JSON.parse(j(i,"utf-8")).version}async function J(){let i=p(),t=await fetch(`http://127.0.0.1:${i}/api/version`,{signal:AbortSignal.timeout(I)});if(!t.ok)throw new Error(`Failed to get worker version: ${t.status}`);return(await t.json()).version}async function z(){let i=Y(),t=await J();i!==t&&E.warn("SYSTEM","Worker version mismatch",{pluginVersion:i,workerVersion:t,hint:"Restart worker with: claude-mem worker restart"})}async function P(){for(let r=0;r<25;r++){try{if(await B()){await z();return}}catch{}await new Promise(e=>setTimeout(e,200))}throw new Error(N({port:p(),customPrefix:"Worker did not become ready within 5 seconds."}))}import q from"path";function k(i){if(!i||i.trim()==="")return E.warn("PROJECT_NAME","Empty cwd provided, using fallback",{cwd:i}),"unknown-project";let t=q.basename(i);if(t===""){if(process.platform==="win32"){let e=i.match(/^([A-Z]):\\/i);if(e){let o=`drive-${e[1].toUpperCase()}`;return E.info("PROJECT_NAME","Drive root detected",{cwd:i,projectName:o}),o}}return E.warn("PROJECT_NAME","Root directory detected, using fallback",{cwd:i}),"unknown-project"}return t}async function Q(i){if(await P(),!i)throw new Error("newHook requires input");let{session_id:t,cwd:r,prompt:e}=i,n=k(r);E.info("HOOK","new-hook: Received hook input",{session_id:t,has_prompt:!!e,cwd:r});let o=p();E.info("HOOK","new-hook: Calling /api/sessions/init",{contentSessionId:t,project:n,prompt_length:e?.length});let a=await 
fetch(`http://127.0.0.1:${o}/api/sessions/init`,{method:"POST",headers:{"Content-Type":"application/json"},body:JSON.stringify({contentSessionId:t,project:n,prompt:e}),signal:AbortSignal.timeout(5e3)});if(!a.ok)throw new Error(`Session initialization failed: ${a.status}`);let s=await a.json(),l=s.sessionDbId,_=s.promptNumber;if(E.info("HOOK","new-hook: Received from /api/sessions/init",{sessionDbId:l,promptNumber:_,skipped:s.skipped}),s.skipped&&s.reason==="private"){E.info("HOOK",`new-hook: Session ${l}, prompt #${_} (fully private - skipped)`),console.log(S);return}E.info("HOOK",`new-hook: Session ${l}, prompt #${_}`);let c=e.startsWith("/")?e.substring(1):e;E.info("HOOK","new-hook: Calling /sessions/{sessionDbId}/init",{sessionDbId:l,promptNumber:_,userPrompt_length:c?.length});let u=await fetch(`http://127.0.0.1:${o}/sessions/${l}/init`,{method:"POST",headers:{"Content-Type":"application/json"},body:JSON.stringify({userPrompt:c,promptNumber:_}),signal:AbortSignal.timeout(5e3)});if(!u.ok)throw new Error(`SDK agent start failed: ${u.status}`);console.log(S)}var L="";y.on("data",i=>L+=i);y.on("end",async()=>{let i;try{i=L?JSON.parse(L):void 0}catch(t){throw new Error(`Failed to parse hook input: ${t instanceof Error?t.message:String(t)}`)}await Q(i)});
@@ -3,7 +3,7 @@ import{stdin as P}from"process";var U=JSON.stringify({continue:!0,suppressOutput
${t.stack}`:t.message;if(Array.isArray(t))return`[${t.length} items]`;let r=Object.keys(t);return r.length===0?"{}":r.length<=3?JSON.stringify(t):`{${r.length} keys: ${r.slice(0,3).join(", ")}...}`}return String(t)}formatTool(t,r){if(!r)return t;let e=typeof r=="string"?JSON.parse(r):r;if(t==="Bash"&&e.command)return`${t}(${e.command})`;if(e.file_path)return`${t}(${e.file_path})`;if(e.notebook_path)return`${t}(${e.notebook_path})`;if(t==="Glob"&&e.pattern)return`${t}(${e.pattern})`;if(t==="Grep"&&e.pattern)return`${t}(${e.pattern})`;if(e.url)return`${t}(${e.url})`;if(e.query)return`${t}(${e.query})`;if(t==="Task"){if(e.subagent_type)return`${t}(${e.subagent_type})`;if(e.description)return`${t}(${e.description})`}return t==="Skill"&&e.skill?`${t}(${e.skill})`:t==="LSP"&&e.operation?`${t}(${e.operation})`:t}formatTimestamp(t){let r=t.getFullYear(),e=String(t.getMonth()+1).padStart(2,"0"),n=String(t.getDate()).padStart(2,"0"),o=String(t.getHours()).padStart(2,"0"),i=String(t.getMinutes()).padStart(2,"0"),E=String(t.getSeconds()).padStart(2,"0"),l=String(t.getMilliseconds()).padStart(3,"0");return`${r}-${e}-${n} ${o}:${i}:${E}.${l}`}log(t,r,e,n,o){if(t<this.getLevel())return;let i=this.formatTimestamp(new Date),E=f[t].padEnd(5),l=r.padEnd(6),g="";n?.correlationId?g=`[${n.correlationId}] `:n?.sessionId&&(g=`[session-${n.sessionId}] `);let c="";o!=null&&(o instanceof Error?c=this.getLevel()===0?`
${o.message}
${o.stack}`:` ${o.message}`:this.getLevel()===0&&typeof o=="object"?c=`
`+JSON.stringify(o,null,2):c=" "+this.formatData(o));let T="";if(n){let{sessionId:D,sdkSessionId:z,correlationId:Q,...m}=n;Object.keys(m).length>0&&(T=` {${Object.entries(m).map(([$,k])=>`${$}=${k}`).join(", ")}}`)}let C=`[${i}] [${E}] [${l}] ${g}${e}${T}${c}`;if(this.logFilePath)try{b(this.logFilePath,C+`
`+JSON.stringify(o,null,2):c=" "+this.formatData(o));let T="";if(n){let{sessionId:D,memorySessionId:z,correlationId:Q,...m}=n;Object.keys(m).length>0&&(T=` {${Object.entries(m).map(([$,k])=>`${$}=${k}`).join(", ")}}`)}let C=`[${i}] [${E}] [${l}] ${g}${e}${T}${c}`;if(this.logFilePath)try{b(this.logFilePath,C+`
`,"utf8")}catch(D){process.stderr.write(`[LOGGER] Failed to write to log file: ${D}
`)}else process.stderr.write(C+`
`)}debug(t,r,e,n){this.log(0,t,r,e,n)}info(t,r,e,n){this.log(1,t,r,e,n)}warn(t,r,e,n){this.log(2,t,r,e,n)}error(t,r,e,n){this.log(3,t,r,e,n)}dataIn(t,r,e,n){this.info(t,`\u2192 ${r}`,e,n)}dataOut(t,r,e,n){this.info(t,`\u2190 ${r}`,e,n)}success(t,r,e,n){this.info(t,`\u2713 ${r}`,e,n)}failure(t,r,e,n){this.error(t,`\u2717 ${r}`,e,n)}timing(t,r,e,n){this.info(t,`\u23F1 ${r}`,n,{duration:`${e}ms`})}happyPathError(t,r,e,n,o=""){let g=((new Error().stack||"").split(`
@@ -16,4 +16,4 @@ ${o.stack}`:` ${o.message}`:this.getLevel()===0&&typeof o=="object"?c=`
If that doesn't work, try: /troubleshoot`),n&&(E=`Worker Error: ${n}
${E}`),E}var V=A.join(K(),".claude","plugins","marketplaces","thedotmack"),N=I(u.HEALTH_CHECK),S=null;function O(){if(S!==null)return S;let s=A.join(a.get("CLAUDE_MEM_DATA_DIR"),"settings.json"),t=a.loadFromFile(s);return S=parseInt(t.CLAUDE_MEM_WORKER_PORT,10),S}async function j(){let s=O();return(await fetch(`http://127.0.0.1:${s}/api/readiness`,{signal:AbortSignal.timeout(N)})).ok}function B(){let s=A.join(V,"package.json");return JSON.parse(X(s,"utf-8")).version}async function Y(){let s=O(),t=await fetch(`http://127.0.0.1:${s}/api/version`,{signal:AbortSignal.timeout(N)});if(!t.ok)throw new Error(`Failed to get worker version: ${t.status}`);return(await t.json()).version}async function J(){let s=B(),t=await Y();s!==t&&_.warn("SYSTEM","Worker version mismatch",{pluginVersion:s,workerVersion:t,hint:"Restart worker with: claude-mem worker restart"})}async function y(){for(let r=0;r<25;r++){try{if(await j()){await J();return}}catch{}await new Promise(e=>setTimeout(e,200))}throw new Error(h({port:O(),customPrefix:"Worker did not become ready within 5 seconds."}))}async function q(s){if(await y(),!s)throw new Error("saveHook requires input");let{session_id:t,cwd:r,tool_name:e,tool_input:n,tool_response:o}=s,i=O(),E=_.formatTool(e,n);if(_.dataIn("HOOK",`PostToolUse: ${E}`,{workerPort:i}),!r)throw new Error(`Missing cwd in PostToolUse hook input for session ${t}, tool ${e}`);let l=await fetch(`http://127.0.0.1:${i}/api/sessions/observations`,{method:"POST",headers:{"Content-Type":"application/json"},body:JSON.stringify({claudeSessionId:t,tool_name:e,tool_input:n,tool_response:o,cwd:r}),signal:AbortSignal.timeout(u.DEFAULT)});if(!l.ok)throw new Error(`Observation storage failed: ${l.status}`);_.debug("HOOK","Observation sent successfully",{toolName:e}),console.log(U)}var L="";P.on("data",s=>L+=s);P.on("end",async()=>{let s;try{s=L?JSON.parse(L):void 0}catch(t){throw new Error(`Failed to parse hook input: ${t instanceof Error?t.message:String(t)}`)}await q(s)});
${E}`),E}var V=A.join(K(),".claude","plugins","marketplaces","thedotmack"),N=I(u.HEALTH_CHECK),S=null;function O(){if(S!==null)return S;let s=A.join(a.get("CLAUDE_MEM_DATA_DIR"),"settings.json"),t=a.loadFromFile(s);return S=parseInt(t.CLAUDE_MEM_WORKER_PORT,10),S}async function j(){let s=O();return(await fetch(`http://127.0.0.1:${s}/api/readiness`,{signal:AbortSignal.timeout(N)})).ok}function B(){let s=A.join(V,"package.json");return JSON.parse(X(s,"utf-8")).version}async function Y(){let s=O(),t=await fetch(`http://127.0.0.1:${s}/api/version`,{signal:AbortSignal.timeout(N)});if(!t.ok)throw new Error(`Failed to get worker version: ${t.status}`);return(await t.json()).version}async function J(){let s=B(),t=await Y();s!==t&&_.warn("SYSTEM","Worker version mismatch",{pluginVersion:s,workerVersion:t,hint:"Restart worker with: claude-mem worker restart"})}async function y(){for(let r=0;r<25;r++){try{if(await j()){await J();return}}catch{}await new Promise(e=>setTimeout(e,200))}throw new Error(h({port:O(),customPrefix:"Worker did not become ready within 5 seconds."}))}async function q(s){if(await y(),!s)throw new Error("saveHook requires input");let{session_id:t,cwd:r,tool_name:e,tool_input:n,tool_response:o}=s,i=O(),E=_.formatTool(e,n);if(_.dataIn("HOOK",`PostToolUse: ${E}`,{workerPort:i}),!r)throw new Error(`Missing cwd in PostToolUse hook input for session ${t}, tool ${e}`);let l=await fetch(`http://127.0.0.1:${i}/api/sessions/observations`,{method:"POST",headers:{"Content-Type":"application/json"},body:JSON.stringify({contentSessionId:t,tool_name:e,tool_input:n,tool_response:o,cwd:r}),signal:AbortSignal.timeout(u.DEFAULT)});if(!l.ok)throw new Error(`Observation storage failed: ${l.status}`);_.debug("HOOK","Observation sent successfully",{toolName:e}),console.log(U)}var L="";P.on("data",s=>L+=s);P.on("end",async()=>{let s;try{s=L?JSON.parse(L):void 0}catch(t){throw new Error(`Failed to parse hook input: ${t instanceof Error?t.message:String(t)}`)}await q(s)});
@@ -1,9 +1,9 @@
#!/usr/bin/env bun
import{stdin as k}from"process";var f=JSON.stringify({continue:!0,suppressOutput:!0});import{readFileSync as v,writeFileSync as F,existsSync as x}from"fs";import{join as H}from"path";import{homedir as W}from"os";var d="bugfix,feature,refactor,discovery,decision,change",h="how-it-works,why-it-exists,what-changed,problem-solution,gotcha,pattern,trade-off";var c=class{static DEFAULTS={CLAUDE_MEM_MODEL:"claude-sonnet-4-5",CLAUDE_MEM_CONTEXT_OBSERVATIONS:"50",CLAUDE_MEM_WORKER_PORT:"37777",CLAUDE_MEM_WORKER_HOST:"127.0.0.1",CLAUDE_MEM_SKIP_TOOLS:"ListMcpResourcesTool,SlashCommand,Skill,TodoWrite,AskUserQuestion",CLAUDE_MEM_PROVIDER:"claude",CLAUDE_MEM_GEMINI_API_KEY:"",CLAUDE_MEM_GEMINI_MODEL:"gemini-2.5-flash-lite",CLAUDE_MEM_GEMINI_RATE_LIMITING_ENABLED:"true",CLAUDE_MEM_OPENROUTER_API_KEY:"",CLAUDE_MEM_OPENROUTER_MODEL:"xiaomi/mimo-v2-flash:free",CLAUDE_MEM_OPENROUTER_SITE_URL:"",CLAUDE_MEM_OPENROUTER_APP_NAME:"claude-mem",CLAUDE_MEM_OPENROUTER_MAX_CONTEXT_MESSAGES:"20",CLAUDE_MEM_OPENROUTER_MAX_TOKENS:"100000",CLAUDE_MEM_DATA_DIR:H(W(),".claude-mem"),CLAUDE_MEM_LOG_LEVEL:"INFO",CLAUDE_MEM_PYTHON_VERSION:"3.13",CLAUDE_CODE_PATH:"",CLAUDE_MEM_MODE:"code",CLAUDE_MEM_CONTEXT_SHOW_READ_TOKENS:"true",CLAUDE_MEM_CONTEXT_SHOW_WORK_TOKENS:"true",CLAUDE_MEM_CONTEXT_SHOW_SAVINGS_AMOUNT:"true",CLAUDE_MEM_CONTEXT_SHOW_SAVINGS_PERCENT:"true",CLAUDE_MEM_CONTEXT_OBSERVATION_TYPES:d,CLAUDE_MEM_CONTEXT_OBSERVATION_CONCEPTS:h,CLAUDE_MEM_CONTEXT_FULL_COUNT:"5",CLAUDE_MEM_CONTEXT_FULL_FIELD:"narrative",CLAUDE_MEM_CONTEXT_SESSION_COUNT:"10",CLAUDE_MEM_CONTEXT_SHOW_LAST_SUMMARY:"true",CLAUDE_MEM_CONTEXT_SHOW_LAST_MESSAGE:"false"};static getAllDefaults(){return{...this.DEFAULTS}}static get(t){return this.DEFAULTS[t]}static getInt(t){let r=this.get(t);return parseInt(r,10)}static getBool(t){return this.get(t)==="true"}static loadFromFile(t){try{if(!x(t))return this.getAllDefaults();let r=v(t,"utf-8"),e=JSON.parse(r),n=e;if(e.env&&typeof 
e.env=="object"){n=e.env;try{F(t,JSON.stringify(n,null,2),"utf-8"),g.info("SETTINGS","Migrated settings file from nested to flat schema",{settingsPath:t})}catch(E){g.warn("SETTINGS","Failed to auto-migrate settings file",{settingsPath:t},E)}}let o={...this.DEFAULTS};for(let E of Object.keys(this.DEFAULTS))n[E]!==void 0&&(o[E]=n[E]);return o}catch(r){return g.warn("SETTINGS","Failed to load settings, using defaults",{settingsPath:t},r),this.getAllDefaults()}}};import{appendFileSync as b,existsSync as G,mkdirSync as K}from"fs";import{join as M}from"path";var p=(o=>(o[o.DEBUG=0]="DEBUG",o[o.INFO=1]="INFO",o[o.WARN=2]="WARN",o[o.ERROR=3]="ERROR",o[o.SILENT=4]="SILENT",o))(p||{}),A=class{level=null;useColor;logFilePath=null;constructor(){this.useColor=process.stdout.isTTY??!1,this.initializeLogFile()}initializeLogFile(){try{let t=c.get("CLAUDE_MEM_DATA_DIR"),r=M(t,"logs");G(r)||K(r,{recursive:!0});let e=new Date().toISOString().split("T")[0];this.logFilePath=M(r,`claude-mem-${e}.log`)}catch(t){console.error("[LOGGER] Failed to initialize log file:",t),this.logFilePath=null}}getLevel(){if(this.level===null)try{let t=c.get("CLAUDE_MEM_DATA_DIR"),r=M(t,"settings.json"),n=c.loadFromFile(r).CLAUDE_MEM_LOG_LEVEL.toUpperCase();this.level=p[n]??1}catch(t){console.error("[LOGGER] Failed to load settings, using INFO level:",t),this.level=1}return this.level}correlationId(t,r){return`obs-${t}-${r}`}sessionId(t){return`session-${t}`}formatData(t){if(t==null)return"";if(typeof t=="string")return t;if(typeof t=="number"||typeof t=="boolean")return t.toString();if(typeof t=="object"){if(t instanceof Error)return this.getLevel()===0?`${t.message}
import{stdin as k}from"process";var f=JSON.stringify({continue:!0,suppressOutput:!0});import{readFileSync as v,writeFileSync as F,existsSync as x}from"fs";import{join as H}from"path";import{homedir as W}from"os";var h="bugfix,feature,refactor,discovery,decision,change",d="how-it-works,why-it-exists,what-changed,problem-solution,gotcha,pattern,trade-off";var c=class{static DEFAULTS={CLAUDE_MEM_MODEL:"claude-sonnet-4-5",CLAUDE_MEM_CONTEXT_OBSERVATIONS:"50",CLAUDE_MEM_WORKER_PORT:"37777",CLAUDE_MEM_WORKER_HOST:"127.0.0.1",CLAUDE_MEM_SKIP_TOOLS:"ListMcpResourcesTool,SlashCommand,Skill,TodoWrite,AskUserQuestion",CLAUDE_MEM_PROVIDER:"claude",CLAUDE_MEM_GEMINI_API_KEY:"",CLAUDE_MEM_GEMINI_MODEL:"gemini-2.5-flash-lite",CLAUDE_MEM_GEMINI_RATE_LIMITING_ENABLED:"true",CLAUDE_MEM_OPENROUTER_API_KEY:"",CLAUDE_MEM_OPENROUTER_MODEL:"xiaomi/mimo-v2-flash:free",CLAUDE_MEM_OPENROUTER_SITE_URL:"",CLAUDE_MEM_OPENROUTER_APP_NAME:"claude-mem",CLAUDE_MEM_OPENROUTER_MAX_CONTEXT_MESSAGES:"20",CLAUDE_MEM_OPENROUTER_MAX_TOKENS:"100000",CLAUDE_MEM_DATA_DIR:H(W(),".claude-mem"),CLAUDE_MEM_LOG_LEVEL:"INFO",CLAUDE_MEM_PYTHON_VERSION:"3.13",CLAUDE_CODE_PATH:"",CLAUDE_MEM_MODE:"code",CLAUDE_MEM_CONTEXT_SHOW_READ_TOKENS:"true",CLAUDE_MEM_CONTEXT_SHOW_WORK_TOKENS:"true",CLAUDE_MEM_CONTEXT_SHOW_SAVINGS_AMOUNT:"true",CLAUDE_MEM_CONTEXT_SHOW_SAVINGS_PERCENT:"true",CLAUDE_MEM_CONTEXT_OBSERVATION_TYPES:h,CLAUDE_MEM_CONTEXT_OBSERVATION_CONCEPTS:d,CLAUDE_MEM_CONTEXT_FULL_COUNT:"5",CLAUDE_MEM_CONTEXT_FULL_FIELD:"narrative",CLAUDE_MEM_CONTEXT_SESSION_COUNT:"10",CLAUDE_MEM_CONTEXT_SHOW_LAST_SUMMARY:"true",CLAUDE_MEM_CONTEXT_SHOW_LAST_MESSAGE:"false"};static getAllDefaults(){return{...this.DEFAULTS}}static get(t){return this.DEFAULTS[t]}static getInt(t){let r=this.get(t);return parseInt(r,10)}static getBool(t){return this.get(t)==="true"}static loadFromFile(t){try{if(!x(t))return this.getAllDefaults();let r=v(t,"utf-8"),e=JSON.parse(r),n=e;if(e.env&&typeof 
e.env=="object"){n=e.env;try{F(t,JSON.stringify(n,null,2),"utf-8"),g.info("SETTINGS","Migrated settings file from nested to flat schema",{settingsPath:t})}catch(E){g.warn("SETTINGS","Failed to auto-migrate settings file",{settingsPath:t},E)}}let o={...this.DEFAULTS};for(let E of Object.keys(this.DEFAULTS))n[E]!==void 0&&(o[E]=n[E]);return o}catch(r){return g.warn("SETTINGS","Failed to load settings, using defaults",{settingsPath:t},r),this.getAllDefaults()}}};import{appendFileSync as b,existsSync as G,mkdirSync as K}from"fs";import{join as M}from"path";var p=(o=>(o[o.DEBUG=0]="DEBUG",o[o.INFO=1]="INFO",o[o.WARN=2]="WARN",o[o.ERROR=3]="ERROR",o[o.SILENT=4]="SILENT",o))(p||{}),A=class{level=null;useColor;logFilePath=null;constructor(){this.useColor=process.stdout.isTTY??!1,this.initializeLogFile()}initializeLogFile(){try{let t=c.get("CLAUDE_MEM_DATA_DIR"),r=M(t,"logs");G(r)||K(r,{recursive:!0});let e=new Date().toISOString().split("T")[0];this.logFilePath=M(r,`claude-mem-${e}.log`)}catch(t){console.error("[LOGGER] Failed to initialize log file:",t),this.logFilePath=null}}getLevel(){if(this.level===null)try{let t=c.get("CLAUDE_MEM_DATA_DIR"),r=M(t,"settings.json"),n=c.loadFromFile(r).CLAUDE_MEM_LOG_LEVEL.toUpperCase();this.level=p[n]??1}catch(t){console.error("[LOGGER] Failed to load settings, using INFO level:",t),this.level=1}return this.level}correlationId(t,r){return`obs-${t}-${r}`}sessionId(t){return`session-${t}`}formatData(t){if(t==null)return"";if(typeof t=="string")return t;if(typeof t=="number"||typeof t=="boolean")return t.toString();if(typeof t=="object"){if(t instanceof Error)return this.getLevel()===0?`${t.message}
${t.stack}`:t.message;if(Array.isArray(t))return`[${t.length} items]`;let r=Object.keys(t);return r.length===0?"{}":r.length<=3?JSON.stringify(t):`{${r.length} keys: ${r.slice(0,3).join(", ")}...}`}return String(t)}formatTool(t,r){if(!r)return t;let e=typeof r=="string"?JSON.parse(r):r;if(t==="Bash"&&e.command)return`${t}(${e.command})`;if(e.file_path)return`${t}(${e.file_path})`;if(e.notebook_path)return`${t}(${e.notebook_path})`;if(t==="Glob"&&e.pattern)return`${t}(${e.pattern})`;if(t==="Grep"&&e.pattern)return`${t}(${e.pattern})`;if(e.url)return`${t}(${e.url})`;if(e.query)return`${t}(${e.query})`;if(t==="Task"){if(e.subagent_type)return`${t}(${e.subagent_type})`;if(e.description)return`${t}(${e.description})`}return t==="Skill"&&e.skill?`${t}(${e.skill})`:t==="LSP"&&e.operation?`${t}(${e.operation})`:t}formatTimestamp(t){let r=t.getFullYear(),e=String(t.getMonth()+1).padStart(2,"0"),n=String(t.getDate()).padStart(2,"0"),o=String(t.getHours()).padStart(2,"0"),E=String(t.getMinutes()).padStart(2,"0"),i=String(t.getSeconds()).padStart(2,"0"),_=String(t.getMilliseconds()).padStart(3,"0");return`${r}-${e}-${n} ${o}:${E}:${i}.${_}`}log(t,r,e,n,o){if(t<this.getLevel())return;let E=this.formatTimestamp(new Date),i=p[t].padEnd(5),_=r.padEnd(6),a="";n?.correlationId?a=`[${n.correlationId}] `:n?.sessionId&&(a=`[session-${n.sessionId}] `);let l="";o!=null&&(o instanceof Error?l=this.getLevel()===0?`
${o.message}
${o.stack}`:` ${o.message}`:this.getLevel()===0&&typeof o=="object"?l=`
`+JSON.stringify(o,null,2):l=" "+this.formatData(o));let S="";if(n){let{sessionId:U,sdkSessionId:tt,correlationId:et,...R}=n;Object.keys(R).length>0&&(S=` {${Object.entries(R).map(([P,w])=>`${P}=${w}`).join(", ")}}`)}let D=`[${E}] [${i}] [${_}] ${a}${e}${S}${l}`;if(this.logFilePath)try{b(this.logFilePath,D+`
`+JSON.stringify(o,null,2):l=" "+this.formatData(o));let S="";if(n){let{sessionId:U,memorySessionId:tt,correlationId:et,...R}=n;Object.keys(R).length>0&&(S=` {${Object.entries(R).map(([P,w])=>`${P}=${w}`).join(", ")}}`)}let D=`[${E}] [${i}] [${_}] ${a}${e}${S}${l}`;if(this.logFilePath)try{b(this.logFilePath,D+`
`,"utf8")}catch(U){process.stderr.write(`[LOGGER] Failed to write to log file: ${U}
`)}else process.stderr.write(D+`
`)}debug(t,r,e,n){this.log(0,t,r,e,n)}info(t,r,e,n){this.log(1,t,r,e,n)}warn(t,r,e,n){this.log(2,t,r,e,n)}error(t,r,e,n){this.log(3,t,r,e,n)}dataIn(t,r,e,n){this.info(t,`\u2192 ${r}`,e,n)}dataOut(t,r,e,n){this.info(t,`\u2190 ${r}`,e,n)}success(t,r,e,n){this.info(t,`\u2713 ${r}`,e,n)}failure(t,r,e,n){this.error(t,`\u2717 ${r}`,e,n)}timing(t,r,e,n){this.info(t,`\u23F1 ${r}`,n,{duration:`${e}ms`})}happyPathError(t,r,e,n,o=""){let a=((new Error().stack||"").split(`
@@ -20,4 +20,4 @@ ${i}`),i}var j=L.join(X(),".claude","plugins","marketplaces","thedotmack"),y=I(u
`),o=!1;for(let E=n.length-1;E>=0;E--){let i=JSON.parse(n[E]);if(i.type===t&&(o=!0,i.message?.content)){let _="",a=i.message.content;if(typeof a=="string")_=a;else if(Array.isArray(a))_=a.filter(l=>l.type==="text").map(l=>l.text).join(`
`);else throw new Error(`Unknown message content format in transcript. Type: ${typeof a}`);return r&&(_=_.replace(/<system-reminder>[\s\S]*?<\/system-reminder>/g,""),_=_.replace(/\n{3,}/g,`
`).trim()),_}}if(!o)throw new Error(`No message found for role '${t}' in transcript: ${s}`);return""}async function Z(s){if(await $(),!s)throw new Error("summaryHook requires input");let{session_id:t}=s,r=O();if(!s.transcript_path)throw new Error(`Missing transcript_path in Stop hook input for session ${t}`);let e=m(s.transcript_path,"user"),n=m(s.transcript_path,"assistant",!0);g.dataIn("HOOK","Stop: Requesting summary",{workerPort:r,hasLastUserMessage:!!e,hasLastAssistantMessage:!!n});let o=await fetch(`http://127.0.0.1:${r}/api/sessions/summarize`,{method:"POST",headers:{"Content-Type":"application/json"},body:JSON.stringify({claudeSessionId:t,last_user_message:e,last_assistant_message:n}),signal:AbortSignal.timeout(u.DEFAULT)});if(!o.ok)throw console.log(f),new Error(`Summary generation failed: ${o.status}`);g.debug("HOOK","Summary request sent successfully"),console.log(f)}var C="";k.on("data",s=>C+=s);k.on("end",async()=>{let s;try{s=C?JSON.parse(C):void 0}catch(t){throw new Error(`Failed to parse hook input: ${t instanceof Error?t.message:String(t)}`)}await Z(s)});
`).trim()),_}}if(!o)throw new Error(`No message found for role '${t}' in transcript: ${s}`);return""}async function Z(s){if(await $(),!s)throw new Error("summaryHook requires input");let{session_id:t}=s,r=O();if(!s.transcript_path)throw new Error(`Missing transcript_path in Stop hook input for session ${t}`);let e=m(s.transcript_path,"user"),n=m(s.transcript_path,"assistant",!0);g.dataIn("HOOK","Stop: Requesting summary",{workerPort:r,hasLastUserMessage:!!e,hasLastAssistantMessage:!!n});let o=await fetch(`http://127.0.0.1:${r}/api/sessions/summarize`,{method:"POST",headers:{"Content-Type":"application/json"},body:JSON.stringify({contentSessionId:t,last_user_message:e,last_assistant_message:n}),signal:AbortSignal.timeout(u.DEFAULT)});if(!o.ok)throw console.log(f),new Error(`Summary generation failed: ${o.status}`);g.debug("HOOK","Summary request sent successfully"),console.log(f)}var C="";k.on("data",s=>C+=s);k.on("end",async()=>{let s;try{s=C?JSON.parse(C):void 0}catch(t){throw new Error(`Failed to parse hook input: ${t instanceof Error?t.message:String(t)}`)}await Z(s)});
@@ -3,11 +3,11 @@ import{basename as z}from"path";import L from"path";import{homedir as K}from"os"
${t.stack}`:t.message;if(Array.isArray(t))return`[${t.length} items]`;let r=Object.keys(t);return r.length===0?"{}":r.length<=3?JSON.stringify(t):`{${r.length} keys: ${r.slice(0,3).join(", ")}...}`}return String(t)}formatTool(t,r){if(!r)return t;let e=typeof r=="string"?JSON.parse(r):r;if(t==="Bash"&&e.command)return`${t}(${e.command})`;if(e.file_path)return`${t}(${e.file_path})`;if(e.notebook_path)return`${t}(${e.notebook_path})`;if(t==="Glob"&&e.pattern)return`${t}(${e.pattern})`;if(t==="Grep"&&e.pattern)return`${t}(${e.pattern})`;if(e.url)return`${t}(${e.url})`;if(e.query)return`${t}(${e.query})`;if(t==="Task"){if(e.subagent_type)return`${t}(${e.subagent_type})`;if(e.description)return`${t}(${e.description})`}return t==="Skill"&&e.skill?`${t}(${e.skill})`:t==="LSP"&&e.operation?`${t}(${e.operation})`:t}formatTimestamp(t){let r=t.getFullYear(),e=String(t.getMonth()+1).padStart(2,"0"),n=String(t.getDate()).padStart(2,"0"),o=String(t.getHours()).padStart(2,"0"),E=String(t.getMinutes()).padStart(2,"0"),s=String(t.getSeconds()).padStart(2,"0"),T=String(t.getMilliseconds()).padStart(3,"0");return`${r}-${e}-${n} ${o}:${E}:${s}.${T}`}log(t,r,e,n,o){if(t<this.getLevel())return;let E=this.formatTimestamp(new Date),s=S[t].padEnd(5),T=r.padEnd(6),a="";n?.correlationId?a=`[${n.correlationId}] `:n?.sessionId&&(a=`[session-${n.sessionId}] `);let l="";o!=null&&(o instanceof Error?l=this.getLevel()===0?`
${o.message}
${o.stack}`:` ${o.message}`:this.getLevel()===0&&typeof o=="object"?l=`
`+JSON.stringify(o,null,2):l=" "+this.formatData(o));let u="";if(n){let{sessionId:D,sdkSessionId:Z,correlationId:tt,...m}=n;Object.keys(m).length>0&&(u=` {${Object.entries(m).map(([P,k])=>`${P}=${k}`).join(", ")}}`)}let C=`[${E}] [${s}] [${T}] ${a}${e}${u}${l}`;if(this.logFilePath)try{x(this.logFilePath,C+`
`+JSON.stringify(o,null,2):l=" "+this.formatData(o));let u="";if(n){let{sessionId:D,memorySessionId:Z,correlationId:tt,...m}=n;Object.keys(m).length>0&&(u=` {${Object.entries(m).map(([P,k])=>`${P}=${k}`).join(", ")}}`)}let C=`[${E}] [${s}] [${T}] ${a}${e}${u}${l}`;if(this.logFilePath)try{x(this.logFilePath,C+`
`,"utf8")}catch(D){process.stderr.write(`[LOGGER] Failed to write to log file: ${D}
`)}else process.stderr.write(C+`
`)}debug(t,r,e,n){this.log(0,t,r,e,n)}info(t,r,e,n){this.log(1,t,r,e,n)}warn(t,r,e,n){this.log(2,t,r,e,n)}error(t,r,e,n){this.log(3,t,r,e,n)}dataIn(t,r,e,n){this.info(t,`\u2192 ${r}`,e,n)}dataOut(t,r,e,n){this.info(t,`\u2190 ${r}`,e,n)}success(t,r,e,n){this.info(t,`\u2713 ${r}`,e,n)}failure(t,r,e,n){this.error(t,`\u2717 ${r}`,e,n)}timing(t,r,e,n){this.info(t,`\u23F1 ${r}`,n,{duration:`${e}ms`})}happyPathError(t,r,e,n,o=""){let a=((new Error().stack||"").split(`
`)[2]||"").match(/at\s+(?:.*\s+)?\(?([^:]+):(\d+):(\d+)\)?/),l=a?`${a[1].split("/").pop()}:${a[2]}`:"unknown",u={...e,location:l};return this.warn(t,`[HAPPY-PATH] ${r}`,u,n),o}},c=new p;var A={DEFAULT:3e5,HEALTH_CHECK:3e4,WORKER_STARTUP_WAIT:1e3,WORKER_STARTUP_RETRIES:300,PRE_RESTART_SETTLE_DELAY:2e3,WINDOWS_MULTIPLIER:1.5},h={SUCCESS:0,FAILURE:1,USER_MESSAGE_ONLY:3};function I(i){return process.platform==="win32"?Math.round(i*A.WINDOWS_MULTIPLIER):i}function d(i={}){let{port:t,includeSkillFallback:r=!1,customPrefix:e,actualError:n}=i,o=e||"Worker service connection failed.",E=t?` (port ${t})`:"",s=`${o}${E}
`)[2]||"").match(/at\s+(?:.*\s+)?\(?([^:]+):(\d+):(\d+)\)?/),l=a?`${a[1].split("/").pop()}:${a[2]}`:"unknown",u={...e,location:l};return this.warn(t,`[HAPPY-PATH] ${r}`,u,n),o}},c=new p;var A={DEFAULT:3e5,HEALTH_CHECK:3e4,WORKER_STARTUP_WAIT:1e3,WORKER_STARTUP_RETRIES:300,PRE_RESTART_SETTLE_DELAY:2e3,WINDOWS_MULTIPLIER:1.5},h={SUCCESS:0,FAILURE:1,USER_MESSAGE_ONLY:3};function I(i){return process.platform==="win32"?Math.round(i*A.WINDOWS_MULTIPLIER):i}function N(i={}){let{port:t,includeSkillFallback:r=!1,customPrefix:e,actualError:n}=i,o=e||"Worker service connection failed.",E=t?` (port ${t})`:"",s=`${o}${E}
`;return s+=`To restart the worker:
`,s+=`1. Exit Claude Code completely
@@ -16,7 +16,7 @@ ${o.stack}`:` ${o.message}`:this.getLevel()===0&&typeof o=="object"?l=`
If that doesn't work, try: /troubleshoot`),n&&(s=`Worker Error: ${n}
${s}`),s}var V=L.join(K(),".claude","plugins","marketplaces","thedotmack"),N=I(A.HEALTH_CHECK),M=null;function g(){if(M!==null)return M;let i=L.join(_.get("CLAUDE_MEM_DATA_DIR"),"settings.json"),t=_.loadFromFile(i);return M=parseInt(t.CLAUDE_MEM_WORKER_PORT,10),M}async function j(){let i=g();return(await fetch(`http://127.0.0.1:${i}/api/readiness`,{signal:AbortSignal.timeout(N)})).ok}function B(){let i=L.join(V,"package.json");return JSON.parse(X(i,"utf-8")).version}async function Y(){let i=g(),t=await fetch(`http://127.0.0.1:${i}/api/version`,{signal:AbortSignal.timeout(N)});if(!t.ok)throw new Error(`Failed to get worker version: ${t.status}`);return(await t.json()).version}async function J(){let i=B(),t=await Y();i!==t&&c.warn("SYSTEM","Worker version mismatch",{pluginVersion:i,workerVersion:t,hint:"Restart worker with: claude-mem worker restart"})}async function y(){for(let r=0;r<25;r++){try{if(await j()){await J();return}}catch{}await new Promise(e=>setTimeout(e,200))}throw new Error(d({port:g(),customPrefix:"Worker did not become ready within 5 seconds."}))}await y();var $=g(),q=z(process.cwd()),f=await fetch(`http://127.0.0.1:${$}/api/context/inject?project=${encodeURIComponent(q)}&colors=true`,{method:"GET",signal:AbortSignal.timeout(5e3)});if(!f.ok)throw new Error(`Failed to fetch context: ${f.status}`);var Q=await f.text();console.error(`
${s}`),s}var V=L.join(K(),".claude","plugins","marketplaces","thedotmack"),d=I(A.HEALTH_CHECK),M=null;function g(){if(M!==null)return M;let i=L.join(_.get("CLAUDE_MEM_DATA_DIR"),"settings.json"),t=_.loadFromFile(i);return M=parseInt(t.CLAUDE_MEM_WORKER_PORT,10),M}async function j(){let i=g();return(await fetch(`http://127.0.0.1:${i}/api/readiness`,{signal:AbortSignal.timeout(d)})).ok}function B(){let i=L.join(V,"package.json");return JSON.parse(X(i,"utf-8")).version}async function Y(){let i=g(),t=await fetch(`http://127.0.0.1:${i}/api/version`,{signal:AbortSignal.timeout(d)});if(!t.ok)throw new Error(`Failed to get worker version: ${t.status}`);return(await t.json()).version}async function J(){let i=B(),t=await Y();i!==t&&c.warn("SYSTEM","Worker version mismatch",{pluginVersion:i,workerVersion:t,hint:"Restart worker with: claude-mem worker restart"})}async function y(){for(let r=0;r<25;r++){try{if(await j()){await J();return}}catch{}await new Promise(e=>setTimeout(e,200))}throw new Error(N({port:g(),customPrefix:"Worker did not become ready within 5 seconds."}))}await y();var $=g(),q=z(process.cwd()),f=await fetch(`http://127.0.0.1:${$}/api/context/inject?project=${encodeURIComponent(q)}&colors=true`,{method:"GET",signal:AbortSignal.timeout(5e3)});if(!f.ok)throw new Error(`Failed to fetch context: ${f.status}`);var Q=await f.text();console.error(`
\u{1F4DD} Claude-Mem Context Loaded
\u2139\uFE0F Note: This appears as stderr but is informational only
File diff suppressed because one or more lines are too long
Binary file not shown.
@@ -216,18 +216,18 @@ function main() {
// Try to find existing session first
const existingQuery = db['db'].prepare(`
SELECT sdk_session_id
SELECT memory_session_id
FROM sdk_sessions
WHERE claude_session_id = ?
WHERE content_session_id = ?
`);
const existing = existingQuery.get(sessionMeta.sessionId) as { sdk_session_id: string | null } | undefined;
|
||||
const existing = existingQuery.get(sessionMeta.sessionId) as { memory_session_id: string | null } | undefined;
|
||||
|
||||
if (existing && existing.sdk_session_id) {
|
||||
if (existing && existing.memory_session_id) {
|
||||
// Use existing SDK session ID
|
||||
claudeSessionToSdkSession.set(sessionMeta.sessionId, existing.sdk_session_id);
|
||||
} else if (existing && !existing.sdk_session_id) {
|
||||
// Session exists but sdk_session_id is NULL, update it
|
||||
db['db'].prepare('UPDATE sdk_sessions SET sdk_session_id = ? WHERE claude_session_id = ?')
|
||||
claudeSessionToSdkSession.set(sessionMeta.sessionId, existing.memory_session_id);
|
||||
} else if (existing && !existing.memory_session_id) {
|
||||
// Session exists but memory_session_id is NULL, update it
|
||||
db['db'].prepare('UPDATE sdk_sessions SET memory_session_id = ? WHERE content_session_id = ?')
|
||||
.run(syntheticSdkSessionId, sessionMeta.sessionId);
|
||||
claudeSessionToSdkSession.set(sessionMeta.sessionId, syntheticSdkSessionId);
|
||||
} else {
|
||||
@@ -239,7 +239,7 @@ function main() {
    );

    // Update with synthetic SDK session ID
-   db['db'].prepare('UPDATE sdk_sessions SET sdk_session_id = ? WHERE claude_session_id = ?')
+   db['db'].prepare('UPDATE sdk_sessions SET memory_session_id = ? WHERE content_session_id = ?')
      .run(syntheticSdkSessionId, sessionMeta.sessionId);

    claudeSessionToSdkSession.set(sessionMeta.sessionId, syntheticSdkSessionId);
@@ -289,8 +289,8 @@ function main() {
    }

    // Get SDK session ID
-   const sdkSessionId = claudeSessionToSdkSession.get(sessionMeta.sessionId);
-   if (!sdkSessionId) {
+   const memorySessionId = claudeSessionToSdkSession.get(sessionMeta.sessionId);
+   if (!memorySessionId) {
      skipped++;
      continue;
    }
@@ -301,8 +301,8 @@ function main() {
    // Check for duplicate
    const existingObs = db['db'].prepare(`
      SELECT id FROM observations
-     WHERE sdk_session_id = ? AND title = ? AND subtitle = ? AND type = ?
-   `).get(sdkSessionId, observation.title, observation.subtitle, observation.type);
+     WHERE memory_session_id = ? AND title = ? AND subtitle = ? AND type = ?
+   `).get(memorySessionId, observation.title, observation.subtitle, observation.type);

    if (existingObs) {
      duplicateObs++;
@@ -311,7 +311,7 @@ function main() {

    try {
      db.storeObservation(
-       sdkSessionId,
+       memorySessionId,
        sessionMeta.project,
        observation
      );
@@ -333,8 +333,8 @@ function main() {
    // Check for duplicate
    const existingSum = db['db'].prepare(`
      SELECT id FROM session_summaries
-     WHERE sdk_session_id = ? AND request = ? AND completed = ? AND learned = ?
-   `).get(sdkSessionId, summary.request, summary.completed, summary.learned);
+     WHERE memory_session_id = ? AND request = ? AND completed = ? AND learned = ?
+   `).get(memorySessionId, summary.request, summary.completed, summary.learned);

    if (existingSum) {
      duplicateSum++;
@@ -343,7 +343,7 @@ function main() {

    try {
      db.storeSummary(
-       sdkSessionId,
+       memorySessionId,
        sessionMeta.project,
        summary
      );
@@ -29,14 +29,14 @@ async function newHook(input?: UserPromptSubmitInput): Promise<void> {

  const port = getWorkerPort();

- logger.info('HOOK', 'new-hook: Calling /api/sessions/init', { claudeSessionId: session_id, project, prompt_length: prompt?.length });
+ logger.info('HOOK', 'new-hook: Calling /api/sessions/init', { contentSessionId: session_id, project, prompt_length: prompt?.length });

  // Initialize session via HTTP - handles DB operations and privacy checks
  const initResponse = await fetch(`http://127.0.0.1:${port}/api/sessions/init`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
-     claudeSessionId: session_id,
+     contentSessionId: session_id,
      project,
      prompt
    }),
@@ -51,7 +51,7 @@ async function saveHook(input?: PostToolUseInput): Promise<void> {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
-     claudeSessionId: session_id,
+     contentSessionId: session_id,
      tool_name,
      tool_input,
      tool_response,
@@ -57,7 +57,7 @@ async function summaryHook(input?: StopInput): Promise<void> {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
-     claudeSessionId: session_id,
+     contentSessionId: session_id,
      last_user_message: lastUserMessage,
      last_assistant_message: lastAssistantMessage
    }),
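All three hooks share one request shape: a JSON POST to the local worker carrying the renamed `contentSessionId` field, with a bounded timeout. A self-contained sketch of that shape (the toy `node:http` server below stands in for the real worker; the route and field names come from the diff, everything else is illustrative):

```typescript
import http from 'node:http';
import type { AddressInfo } from 'node:net';

// Toy stand-in for the worker's /api/sessions/init endpoint.
const server = http.createServer((req, res) => {
  let body = '';
  req.on('data', (chunk: Buffer) => { body += chunk; });
  req.on('end', () => {
    const parsed = JSON.parse(body) as { contentSessionId: string };
    res.setHeader('Content-Type', 'application/json');
    res.end(JSON.stringify({ ok: true, contentSessionId: parsed.contentSessionId }));
  });
});

await new Promise<void>(resolve => server.listen(0, '127.0.0.1', resolve));
const port = (server.address() as AddressInfo).port;

// Same call shape as newHook above: JSON POST plus a bounded timeout.
const res = await fetch(`http://127.0.0.1:${port}/api/sessions/init`, {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ contentSessionId: 'abc-123', project: 'demo', prompt: 'hello' }),
  signal: AbortSignal.timeout(5000)
});
const data = await res.json() as { ok: boolean; contentSessionId: string };
server.close();
console.log(data.ok, data.contentSessionId);
```

The `AbortSignal.timeout` guard keeps a hook from hanging the primary session if the worker is down.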
+5
-5
@@ -17,7 +17,7 @@ export interface Observation {

  export interface SDKSession {
    id: number;
-   sdk_session_id: string | null;
+   memory_session_id: string | null;
    project: string;
    user_prompt: string;
    last_user_message?: string;
@@ -148,14 +148,14 @@ ${mode.prompts.summary_footer}`;
  /**
   * Build prompt for continuation of existing session
   *
-  * CRITICAL: Why claudeSessionId Parameter is Required
+  * CRITICAL: Why contentSessionId Parameter is Required
   * ====================================================
-  * This function receives claudeSessionId from SDKAgent.ts, which comes from:
+  * This function receives contentSessionId from SDKAgent.ts, which comes from:
   * - SessionManager.initializeSession (fetched from database)
   * - SessionStore.createSDKSession (stored by new-hook.ts)
   * - new-hook.ts receives it from Claude Code's hook context
   *
-  * The claudeSessionId is the SAME session_id used by:
+  * The contentSessionId is the SAME session_id used by:
   * - NEW hook (to create/fetch session)
   * - SAVE hook (to store observations)
   * - This continuation prompt (to maintain session context)
@@ -166,7 +166,7 @@ ${mode.prompts.summary_footer}`;
   * Called when: promptNumber > 1 (see SDKAgent.ts line 150)
   * First prompt: Uses buildInitPrompt instead (promptNumber === 1)
   */
- export function buildContinuationPrompt(userPrompt: string, promptNumber: number, claudeSessionId: string, mode: ModeConfig): string {
+ export function buildContinuationPrompt(userPrompt: string, promptNumber: number, contentSessionId: string, mode: ModeConfig): string {
    return `${mode.prompts.continuation_greeting}

  <observed_from_primary_session>
@@ -0,0 +1,70 @@
import { EventEmitter } from 'events';
import { PendingMessageStore, PersistentPendingMessage } from '../sqlite/PendingMessageStore.js';
import type { PendingMessageWithId } from '../worker-types.js';
import { logger } from '../../utils/logger.js';

export class SessionQueueProcessor {
  constructor(
    private store: PendingMessageStore,
    private events: EventEmitter
  ) {}

  /**
   * Create an async iterator that yields messages as they become available.
   * Uses atomic database claiming to prevent race conditions.
   * Waits for 'message' event when queue is empty.
   */
  async *createIterator(sessionDbId: number, signal: AbortSignal): AsyncIterableIterator<PendingMessageWithId> {
    while (!signal.aborted) {
      try {
        // 1. Atomically claim next message from DB
        const persistentMessage = this.store.claimNextMessage(sessionDbId);

        if (persistentMessage) {
          // Yield the message for processing
          yield this.toPendingMessageWithId(persistentMessage);
        } else {
          // 2. Queue empty - wait for wake-up event
          // We use a promise that resolves on 'message' event or abort
          await this.waitForMessage(signal);
        }
      } catch (error) {
        if (signal.aborted) return;
        logger.error('SESSION', 'Error in queue processor loop', { sessionDbId }, error as Error);
        // Small backoff to prevent tight loop on DB error
        await new Promise(resolve => setTimeout(resolve, 1000));
      }
    }
  }

  private toPendingMessageWithId(msg: PersistentPendingMessage): PendingMessageWithId {
    const pending = this.store.toPendingMessage(msg);
    return {
      ...pending,
      _persistentId: msg.id,
      _originalTimestamp: msg.created_at_epoch
    };
  }

  private waitForMessage(signal: AbortSignal): Promise<void> {
    return new Promise<void>((resolve) => {
      const onMessage = () => {
        cleanup();
        resolve();
      };

      const onAbort = () => {
        cleanup();
        resolve(); // Resolve to let the loop check signal.aborted and exit
      };

      const cleanup = () => {
        this.events.off('message', onMessage);
        signal.removeEventListener('abort', onAbort);
      };

      this.events.once('message', onMessage);
      signal.addEventListener('abort', onAbort, { once: true });
    });
  }
}
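To make the pump's claim-or-wait behavior concrete, here is a runnable miniature of the same loop: an in-memory stub replaces `PendingMessageStore`, and the identifiers are illustrative rather than the real worker wiring. The producer persists first, then emits the wake-up event, which is what makes a lost event harmless (the next claim finds the row anyway).

```typescript
import { EventEmitter } from 'node:events';

interface Msg { id: number; payload: string }

// Stub store: an in-memory array stands in for the SQLite-backed store.
class StubStore {
  private queue: Msg[] = [];
  enqueue(msg: Msg): void { this.queue.push(msg); }
  claimNextMessage(): Msg | null { return this.queue.shift() ?? null; }
}

// Same claim-or-wait shape as createIterator above.
async function* pump(store: StubStore, events: EventEmitter, signal: AbortSignal): AsyncIterableIterator<Msg> {
  while (!signal.aborted) {
    const msg = store.claimNextMessage();
    if (msg) { yield msg; continue; }
    // Queue empty: park until a 'message' event or abort wakes us.
    await new Promise<void>(resolve => {
      const wake = () => {
        events.off('message', wake);
        signal.removeEventListener('abort', wake);
        resolve();
      };
      events.once('message', wake);
      signal.addEventListener('abort', wake, { once: true });
    });
  }
}

const store = new StubStore();
const events = new EventEmitter();
const controller = new AbortController();
const seen: string[] = [];

const consumer = (async () => {
  for await (const msg of pump(store, events, controller.signal)) {
    seen.push(msg.payload);
    if (seen.length === 2) controller.abort(); // shut the pump down cleanly
  }
})();

// Producer side: persist first, then wake the pump.
store.enqueue({ id: 1, payload: 'observation' });
events.emit('message');
setTimeout(() => { store.enqueue({ id: 2, payload: 'summarize' }); events.emit('message'); }, 10);

await consumer;
console.log(seen.join(','));
```

Note how abort resolves the parked promise instead of rejecting it, so the loop exits through its own `signal.aborted` check rather than an exception.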
@@ -8,7 +8,7 @@ import { logger } from '../../utils/logger.js';
  export interface PersistentPendingMessage {
    id: number;
    session_db_id: number;
-   claude_session_id: string;
+   content_session_id: string;
    message_type: 'observation' | 'summarize';
    tool_name: string | null;
    tool_input: string | null;
@@ -53,11 +53,11 @@ export class PendingMessageStore {
   * Enqueue a new message (persist before processing)
   * @returns The database ID of the persisted message
   */
- enqueue(sessionDbId: number, claudeSessionId: string, message: PendingMessage): number {
+ enqueue(sessionDbId: number, contentSessionId: string, message: PendingMessage): number {
    const now = Date.now();
    const stmt = this.db.prepare(`
      INSERT INTO pending_messages (
-       session_db_id, claude_session_id, message_type,
+       session_db_id, content_session_id, message_type,
        tool_name, tool_input, tool_response, cwd,
        last_user_message, last_assistant_message,
        prompt_number, status, retry_count, created_at_epoch
@@ -66,7 +66,7 @@ export class PendingMessageStore {

    const result = stmt.run(
      sessionDbId,
-     claudeSessionId,
+     contentSessionId,
      message.type,
      message.tool_name || null,
      message.tool_input ? JSON.stringify(message.tool_input) : null,
@@ -82,17 +82,41 @@ export class PendingMessageStore {
  }

  /**
-  * Peek at oldest pending message for session (does NOT change status)
-  * @returns The oldest pending message or null if none
+  * Atomically claim the next pending message for processing
+  * Finds oldest pending -> marks processing -> returns it
+  * Uses a transaction to prevent race conditions
   */
- peekPending(sessionDbId: number): PersistentPendingMessage | null {
-   const stmt = this.db.prepare(`
-     SELECT * FROM pending_messages
-     WHERE session_db_id = ? AND status = 'pending'
-     ORDER BY id ASC
-     LIMIT 1
-   `);
-   return stmt.get(sessionDbId) as PersistentPendingMessage | null;
+ claimNextMessage(sessionDbId: number): PersistentPendingMessage | null {
+   const now = Date.now();
+
+   const claimTx = this.db.transaction((sessionId: number, timestamp: number) => {
+     const peekStmt = this.db.prepare(`
+       SELECT * FROM pending_messages
+       WHERE session_db_id = ? AND status = 'pending'
+       ORDER BY id ASC
+       LIMIT 1
+     `);
+     const msg = peekStmt.get(sessionId) as PersistentPendingMessage | null;
+
+     if (msg) {
+       const updateStmt = this.db.prepare(`
+         UPDATE pending_messages
+         SET status = 'processing', started_processing_at_epoch = ?
+         WHERE id = ?
+       `);
+       updateStmt.run(timestamp, msg.id);
+
+       // Return updated object
+       return {
+         ...msg,
+         status: 'processing',
+         started_processing_at_epoch: timestamp
+       } as PersistentPendingMessage;
+     }
+     return null;
+   });
+
+   return claimTx(sessionDbId, now) as PersistentPendingMessage | null;
  }

  /**
@@ -116,7 +140,7 @@ export class PendingMessageStore {
    const stmt = this.db.prepare(`
      SELECT pm.*, ss.project
      FROM pending_messages pm
-     LEFT JOIN sdk_sessions ss ON pm.claude_session_id = ss.claude_session_id
+     LEFT JOIN sdk_sessions ss ON pm.content_session_id = ss.content_session_id
      WHERE pm.status IN ('pending', 'processing', 'failed')
      ORDER BY
        CASE pm.status
@@ -202,7 +226,7 @@ export class PendingMessageStore {
    const stmt = this.db.prepare(`
      SELECT pm.*, ss.project
      FROM pending_messages pm
-     LEFT JOIN sdk_sessions ss ON pm.claude_session_id = ss.claude_session_id
+     LEFT JOIN sdk_sessions ss ON pm.content_session_id = ss.content_session_id
      WHERE pm.status = 'processed' AND pm.completed_at_epoch > ?
      ORDER BY pm.completed_at_epoch DESC
      LIMIT ?
@@ -330,12 +354,12 @@ export class PendingMessageStore {
  /**
   * Get session info for a pending message (for recovery)
   */
- getSessionInfoForMessage(messageId: number): { sessionDbId: number; claudeSessionId: string } | null {
+ getSessionInfoForMessage(messageId: number): { sessionDbId: number; contentSessionId: string } | null {
    const stmt = this.db.prepare(`
-     SELECT session_db_id, claude_session_id FROM pending_messages WHERE id = ?
+     SELECT session_db_id, content_session_id FROM pending_messages WHERE id = ?
    `);
-   const result = stmt.get(messageId) as { session_db_id: number; claude_session_id: string } | undefined;
-   return result ? { sessionDbId: result.session_db_id, claudeSessionId: result.claude_session_id } : null;
+   const result = stmt.get(messageId) as { session_db_id: number; content_session_id: string } | undefined;
+   return result ? { sessionDbId: result.session_db_id, contentSessionId: result.content_session_id } : null;
  }

  /**
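The key property of `claimNextMessage` is that peek and mark happen as one step, so each message is handed out at most once. A dependency-free miniature of that contract (in-memory rows instead of SQLite; the names are illustrative):

```typescript
type Status = 'pending' | 'processing' | 'processed';
interface Row { id: number; sessionDbId: number; status: Status }

class InMemoryQueue {
  private rows: Row[] = [];
  private nextId = 1;

  enqueue(sessionDbId: number): number {
    const id = this.nextId++;
    this.rows.push({ id, sessionDbId, status: 'pending' });
    return id;
  }

  // Peek + mark happen in one synchronous step, so no two callers can
  // both see the same row as 'pending' (the DB version uses a transaction
  // to get the same guarantee across processes).
  claimNext(sessionDbId: number): Row | null {
    const row = this.rows.find(r => r.sessionDbId === sessionDbId && r.status === 'pending');
    if (!row) return null;
    row.status = 'processing';
    return row;
  }
}

const q = new InMemoryQueue();
q.enqueue(7);
q.enqueue(7);

const first = q.claimNext(7);
const second = q.claimNext(7);
const third = q.claimNext(7);

console.log(first?.id, second?.id, third); // 1 2 null
```

Claiming in id order also preserves FIFO ordering per session, which the old peek-then-mark flow only guaranteed if nothing crashed in between.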
@@ -481,7 +481,7 @@ export class SessionSearch {
    const sql = `
      SELECT up.*
      FROM user_prompts up
-     JOIN sdk_sessions s ON up.claude_session_id = s.claude_session_id
+     JOIN sdk_sessions s ON up.content_session_id = s.content_session_id
      ${whereClause}
      ${orderClause}
      LIMIT ? OFFSET ?
@@ -498,23 +498,23 @@ export class SessionSearch {
  }

  /**
-  * Get all prompts for a session by claude_session_id
+  * Get all prompts for a session by content_session_id
   */
- getUserPromptsBySession(claudeSessionId: string): UserPromptRow[] {
+ getUserPromptsBySession(contentSessionId: string): UserPromptRow[] {
    const stmt = this.db.prepare(`
      SELECT
        id,
-       claude_session_id,
+       content_session_id,
        prompt_number,
        prompt_text,
        created_at,
        created_at_epoch
      FROM user_prompts
-     WHERE claude_session_id = ?
+     WHERE content_session_id = ?
      ORDER BY prompt_number ASC
    `);

-   return stmt.all(claudeSessionId) as UserPromptRow[];
+   return stmt.all(contentSessionId) as UserPromptRow[];
  }

  /**
+189
-127
@@ -43,6 +43,7 @@ export class SessionStore {
    this.createUserPromptsTable();
    this.ensureDiscoveryTokensColumn();
    this.createPendingMessagesTable();
+   this.renameSessionIdColumns();
  }

  /**
@@ -73,8 +74,8 @@ export class SessionStore {
    this.db.run(`
      CREATE TABLE IF NOT EXISTS sdk_sessions (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
-       claude_session_id TEXT UNIQUE NOT NULL,
-       sdk_session_id TEXT UNIQUE,
+       content_session_id TEXT UNIQUE NOT NULL,
+       memory_session_id TEXT UNIQUE,
        project TEXT NOT NULL,
        user_prompt TEXT,
        started_at TEXT NOT NULL,
@@ -84,31 +85,31 @@ export class SessionStore {
        status TEXT CHECK(status IN ('active', 'completed', 'failed')) NOT NULL DEFAULT 'active'
      );

-     CREATE INDEX IF NOT EXISTS idx_sdk_sessions_claude_id ON sdk_sessions(claude_session_id);
-     CREATE INDEX IF NOT EXISTS idx_sdk_sessions_sdk_id ON sdk_sessions(sdk_session_id);
+     CREATE INDEX IF NOT EXISTS idx_sdk_sessions_claude_id ON sdk_sessions(content_session_id);
+     CREATE INDEX IF NOT EXISTS idx_sdk_sessions_sdk_id ON sdk_sessions(memory_session_id);
      CREATE INDEX IF NOT EXISTS idx_sdk_sessions_project ON sdk_sessions(project);
      CREATE INDEX IF NOT EXISTS idx_sdk_sessions_status ON sdk_sessions(status);
      CREATE INDEX IF NOT EXISTS idx_sdk_sessions_started ON sdk_sessions(started_at_epoch DESC);

      CREATE TABLE IF NOT EXISTS observations (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
-       sdk_session_id TEXT NOT NULL,
+       memory_session_id TEXT NOT NULL,
        project TEXT NOT NULL,
        text TEXT NOT NULL,
        type TEXT NOT NULL CHECK(type IN ('decision', 'bugfix', 'feature', 'refactor', 'discovery')),
        created_at TEXT NOT NULL,
        created_at_epoch INTEGER NOT NULL,
-       FOREIGN KEY(sdk_session_id) REFERENCES sdk_sessions(sdk_session_id) ON DELETE CASCADE
+       FOREIGN KEY(memory_session_id) REFERENCES sdk_sessions(memory_session_id) ON DELETE CASCADE
      );

-     CREATE INDEX IF NOT EXISTS idx_observations_sdk_session ON observations(sdk_session_id);
+     CREATE INDEX IF NOT EXISTS idx_observations_sdk_session ON observations(memory_session_id);
      CREATE INDEX IF NOT EXISTS idx_observations_project ON observations(project);
      CREATE INDEX IF NOT EXISTS idx_observations_type ON observations(type);
      CREATE INDEX IF NOT EXISTS idx_observations_created ON observations(created_at_epoch DESC);

      CREATE TABLE IF NOT EXISTS session_summaries (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
-       sdk_session_id TEXT UNIQUE NOT NULL,
+       memory_session_id TEXT UNIQUE NOT NULL,
        project TEXT NOT NULL,
        request TEXT,
        investigated TEXT,
@@ -120,10 +121,10 @@ export class SessionStore {
        notes TEXT,
        created_at TEXT NOT NULL,
        created_at_epoch INTEGER NOT NULL,
-       FOREIGN KEY(sdk_session_id) REFERENCES sdk_sessions(sdk_session_id) ON DELETE CASCADE
+       FOREIGN KEY(memory_session_id) REFERENCES sdk_sessions(memory_session_id) ON DELETE CASCADE
      );

-     CREATE INDEX IF NOT EXISTS idx_session_summaries_sdk_session ON session_summaries(sdk_session_id);
+     CREATE INDEX IF NOT EXISTS idx_session_summaries_sdk_session ON session_summaries(memory_session_id);
      CREATE INDEX IF NOT EXISTS idx_session_summaries_project ON session_summaries(project);
      CREATE INDEX IF NOT EXISTS idx_session_summaries_created ON session_summaries(created_at_epoch DESC);
    `);
@@ -200,7 +201,7 @@ export class SessionStore {
  }

  /**
-  * Remove UNIQUE constraint from session_summaries.sdk_session_id (migration 7)
+  * Remove UNIQUE constraint from session_summaries.memory_session_id (migration 7)
   */
  private removeSessionSummariesUniqueConstraint(): void {
    // Check if migration already applied
@@ -217,7 +218,7 @@ export class SessionStore {
      return;
    }

-   logger.info('DB', 'Removing UNIQUE constraint from session_summaries.sdk_session_id');
+   logger.info('DB', 'Removing UNIQUE constraint from session_summaries.memory_session_id');

    // Begin transaction
    this.db.run('BEGIN TRANSACTION');
@@ -227,7 +228,7 @@ export class SessionStore {
    this.db.run(`
      CREATE TABLE session_summaries_new (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
-       sdk_session_id TEXT NOT NULL,
+       memory_session_id TEXT NOT NULL,
        project TEXT NOT NULL,
        request TEXT,
        investigated TEXT,
@@ -240,14 +241,14 @@ export class SessionStore {
        prompt_number INTEGER,
        created_at TEXT NOT NULL,
        created_at_epoch INTEGER NOT NULL,
-       FOREIGN KEY(sdk_session_id) REFERENCES sdk_sessions(sdk_session_id) ON DELETE CASCADE
+       FOREIGN KEY(memory_session_id) REFERENCES sdk_sessions(memory_session_id) ON DELETE CASCADE
      )
    `);

    // Copy data from old table
    this.db.run(`
      INSERT INTO session_summaries_new
-     SELECT id, sdk_session_id, project, request, investigated, learned,
+     SELECT id, memory_session_id, project, request, investigated, learned,
             completed, next_steps, files_read, files_edited, notes,
             prompt_number, created_at, created_at_epoch
      FROM session_summaries
@@ -261,7 +262,7 @@ export class SessionStore {

    // Recreate indexes
    this.db.run(`
-     CREATE INDEX idx_session_summaries_sdk_session ON session_summaries(sdk_session_id);
+     CREATE INDEX idx_session_summaries_sdk_session ON session_summaries(memory_session_id);
      CREATE INDEX idx_session_summaries_project ON session_summaries(project);
      CREATE INDEX idx_session_summaries_created ON session_summaries(created_at_epoch DESC);
    `);
@@ -272,7 +273,7 @@ export class SessionStore {
    // Record migration
    this.db.prepare('INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)').run(7, new Date().toISOString());

-   logger.info('DB', 'Successfully removed UNIQUE constraint from session_summaries.sdk_session_id');
+   logger.info('DB', 'Successfully removed UNIQUE constraint from session_summaries.memory_session_id');
  } catch (error: any) {
    // Rollback on error
    this.db.run('ROLLBACK');
@@ -346,7 +347,7 @@ export class SessionStore {
    this.db.run(`
      CREATE TABLE observations_new (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
-       sdk_session_id TEXT NOT NULL,
+       memory_session_id TEXT NOT NULL,
        project TEXT NOT NULL,
        text TEXT,
        type TEXT NOT NULL CHECK(type IN ('decision', 'bugfix', 'feature', 'refactor', 'discovery', 'change')),
@@ -360,14 +361,14 @@ export class SessionStore {
        prompt_number INTEGER,
        created_at TEXT NOT NULL,
        created_at_epoch INTEGER NOT NULL,
-       FOREIGN KEY(sdk_session_id) REFERENCES sdk_sessions(sdk_session_id) ON DELETE CASCADE
+       FOREIGN KEY(memory_session_id) REFERENCES sdk_sessions(memory_session_id) ON DELETE CASCADE
      )
    `);

    // Copy data from old table (all existing columns)
    this.db.run(`
      INSERT INTO observations_new
-     SELECT id, sdk_session_id, project, text, type, title, subtitle, facts,
+     SELECT id, memory_session_id, project, text, type, title, subtitle, facts,
             narrative, concepts, files_read, files_modified, prompt_number,
             created_at, created_at_epoch
      FROM observations
@@ -381,7 +382,7 @@ export class SessionStore {

    // Recreate indexes
    this.db.run(`
-     CREATE INDEX idx_observations_sdk_session ON observations(sdk_session_id);
+     CREATE INDEX idx_observations_sdk_session ON observations(memory_session_id);
      CREATE INDEX idx_observations_project ON observations(project);
      CREATE INDEX idx_observations_type ON observations(type);
      CREATE INDEX idx_observations_created ON observations(created_at_epoch DESC);
@@ -423,22 +424,22 @@ export class SessionStore {
    this.db.run('BEGIN TRANSACTION');

    try {
-     // Create main table (using claude_session_id since sdk_session_id is set asynchronously by worker)
+     // Create main table (using content_session_id since memory_session_id is set asynchronously by worker)
      this.db.run(`
        CREATE TABLE user_prompts (
          id INTEGER PRIMARY KEY AUTOINCREMENT,
-         claude_session_id TEXT NOT NULL,
+         content_session_id TEXT NOT NULL,
          prompt_number INTEGER NOT NULL,
          prompt_text TEXT NOT NULL,
          created_at TEXT NOT NULL,
          created_at_epoch INTEGER NOT NULL,
-         FOREIGN KEY(claude_session_id) REFERENCES sdk_sessions(claude_session_id) ON DELETE CASCADE
+         FOREIGN KEY(content_session_id) REFERENCES sdk_sessions(content_session_id) ON DELETE CASCADE
        );

-       CREATE INDEX idx_user_prompts_claude_session ON user_prompts(claude_session_id);
+       CREATE INDEX idx_user_prompts_claude_session ON user_prompts(content_session_id);
        CREATE INDEX idx_user_prompts_created ON user_prompts(created_at_epoch DESC);
        CREATE INDEX idx_user_prompts_prompt_number ON user_prompts(prompt_number);
-       CREATE INDEX idx_user_prompts_lookup ON user_prompts(claude_session_id, prompt_number);
+       CREATE INDEX idx_user_prompts_lookup ON user_prompts(content_session_id, prompt_number);
      `);

      // Create FTS5 virtual table
@@ -545,7 +546,7 @@ export class SessionStore {
      CREATE TABLE pending_messages (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        session_db_id INTEGER NOT NULL,
-       claude_session_id TEXT NOT NULL,
+       content_session_id TEXT NOT NULL,
        message_type TEXT NOT NULL CHECK(message_type IN ('observation', 'summarize')),
        tool_name TEXT,
        tool_input TEXT,
@@ -565,7 +566,7 @@ export class SessionStore {

    this.db.run('CREATE INDEX IF NOT EXISTS idx_pending_messages_session ON pending_messages(session_db_id)');
    this.db.run('CREATE INDEX IF NOT EXISTS idx_pending_messages_status ON pending_messages(status)');
-   this.db.run('CREATE INDEX IF NOT EXISTS idx_pending_messages_claude_session ON pending_messages(claude_session_id)');
+   this.db.run('CREATE INDEX IF NOT EXISTS idx_pending_messages_claude_session ON pending_messages(content_session_id)');

    this.db.prepare('INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)').run(16, new Date().toISOString());
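A sketch of the message lifecycle implied by the statuses that appear in this diff ('pending', 'processing', 'processed', 'failed'). The transition table below is an assumption inferred from `claimNextMessage` and the `retry_count` column, not code from the PR:

```typescript
type MsgStatus = 'pending' | 'processing' | 'processed' | 'failed';

// Assumed transitions: claim moves pending -> processing; the worker outcome
// moves processing -> processed or failed; a retry re-queues a failed row.
const allowed: Record<MsgStatus, MsgStatus[]> = {
  pending: ['processing'],
  processing: ['processed', 'failed'],
  failed: ['pending'],
  processed: [] // terminal
};

function canTransition(from: MsgStatus, to: MsgStatus): boolean {
  return allowed[from].includes(to);
}

console.log(canTransition('pending', 'processing'));  // true
console.log(canTransition('processed', 'pending'));   // false
```

Because every state lives in one SQLite row, a crashed worker leaves its message visibly stuck in 'processing', which is what makes recovery queries possible.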
@@ -576,6 +577,67 @@ export class SessionStore {
    }
  }

+ /**
+  * Rename session ID columns for semantic clarity (migration 17)
+  * - claude_session_id → content_session_id (user's observed session)
+  * - sdk_session_id → memory_session_id (memory agent's session for resume)
+  */
+ private renameSessionIdColumns(): void {
+   try {
+     const applied = this.db.prepare('SELECT version FROM schema_versions WHERE version = ?').get(17) as SchemaVersion | undefined;
+     if (applied) return;
+
+     logger.info('DB', 'Renaming session ID columns for semantic clarity');
+
+     // Check if columns are already renamed (idempotent check)
+     const sessionsInfo = this.db.query('PRAGMA table_info(sdk_sessions)').all() as TableColumnInfo[];
+     const hasContentSessionId = sessionsInfo.some(col => col.name === 'content_session_id');
+
+     if (hasContentSessionId) {
+       // Already renamed, just record migration
+       this.db.prepare('INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)').run(17, new Date().toISOString());
+       return;
+     }
+
+     // SQLite 3.25+ supports ALTER TABLE RENAME COLUMN
+     // Rename in sdk_sessions table
+     this.db.run('ALTER TABLE sdk_sessions RENAME COLUMN claude_session_id TO content_session_id');
+     this.db.run('ALTER TABLE sdk_sessions RENAME COLUMN sdk_session_id TO memory_session_id');
+
+     // Rename in pending_messages table
+     this.db.run('ALTER TABLE pending_messages RENAME COLUMN claude_session_id TO content_session_id');
+
+     // Rename in observations table
+     this.db.run('ALTER TABLE observations RENAME COLUMN sdk_session_id TO memory_session_id');
+
+     // Rename in session_summaries table
+     this.db.run('ALTER TABLE session_summaries RENAME COLUMN sdk_session_id TO memory_session_id');
+
+     // Rename in user_prompts table
+     this.db.run('ALTER TABLE user_prompts RENAME COLUMN claude_session_id TO content_session_id');
+
+     // Record migration
+     this.db.prepare('INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)').run(17, new Date().toISOString());
+
+     logger.info('DB', 'Successfully renamed session ID columns');
+   } catch (error: any) {
+     logger.error('DB', 'Session ID column rename migration error', undefined, error);
+     throw error;
+   }
+ }
+
+ /**
+  * Update the memory session ID for a session
+  * Called by SDKAgent when it captures the session ID from the first SDK message
+  */
+ updateMemorySessionId(sessionDbId: number, memorySessionId: string): void {
+   this.db.prepare(`
+     UPDATE sdk_sessions
+     SET memory_session_id = ?
+     WHERE id = ?
+   `).run(memorySessionId, sessionDbId);
+ }
+
  /**
   * Get recent session summaries for a project
   */
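The migration above combines two guards: a recorded schema version and a PRAGMA-based column check, so it is safe to run on every startup. A toy version of that guard pattern (the `ToyDb` below merely records SQL; it is illustrative, not the real SessionStore):

```typescript
// Minimal double-guard sketch: skip if the version is recorded, skip the DDL
// if the column is already renamed, and record the version either way.
class ToyDb {
  applied = new Set<number>();
  log: string[] = [];
  run(sql: string): void { this.log.push(sql); }
}

function renameColumnsMigration(db: ToyDb, version: number, hasNewColumn: boolean): void {
  if (db.applied.has(version)) return; // guard 1: schema_versions row
  if (!hasNewColumn) {                 // guard 2: PRAGMA table_info check
    // SQLite 3.25+ rename; one statement per table keeps each step simple
    db.run('ALTER TABLE sdk_sessions RENAME COLUMN claude_session_id TO content_session_id');
  }
  db.applied.add(version);             // record even when columns were already renamed
}

const db = new ToyDb();
renameColumnsMigration(db, 17, false);
renameColumnsMigration(db, 17, false); // second call is a no-op

console.log(db.log.length, db.applied.has(17)); // 1 true
```

The second guard matters when a previous run renamed the columns but crashed before writing the version row; without it the ALTER would fail on a column that no longer exists.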
@@ -608,7 +670,7 @@ export class SessionStore {
   * Get recent summaries with session info for context display
   */
  getRecentSummariesWithSessionInfo(project: string, limit: number = 3): Array<{
-   sdk_session_id: string;
+   memory_session_id: string;
    request: string | null;
    learned: string | null;
    completed: string | null;
@@ -618,7 +680,7 @@ export class SessionStore {
  }> {
    const stmt = this.db.prepare(`
      SELECT
-       sdk_session_id, request, learned, completed, next_steps,
+       memory_session_id, request, learned, completed, next_steps,
        prompt_number, created_at
      FROM session_summaries
      WHERE project = ?
@@ -708,7 +770,7 @@ export class SessionStore {
   */
  getAllRecentUserPrompts(limit: number = 100): Array<{
    id: number;
-   claude_session_id: string;
+   content_session_id: string;
    project: string;
    prompt_number: number;
    prompt_text: string;
@@ -718,14 +780,14 @@ export class SessionStore {
    const stmt = this.db.prepare(`
      SELECT
        up.id,
-       up.claude_session_id,
+       up.content_session_id,
        s.project,
        up.prompt_number,
        up.prompt_text,
        up.created_at,
        up.created_at_epoch
      FROM user_prompts up
-     LEFT JOIN sdk_sessions s ON up.claude_session_id = s.claude_session_id
+     LEFT JOIN sdk_sessions s ON up.content_session_id = s.content_session_id
      ORDER BY up.created_at_epoch DESC
      LIMIT ?
    `);
@@ -752,10 +814,10 @@ export class SessionStore {
   * Get latest user prompt with session info for a Claude session
   * Used for syncing prompts to Chroma during session initialization
   */
- getLatestUserPrompt(claudeSessionId: string): {
+ getLatestUserPrompt(contentSessionId: string): {
    id: number;
-   claude_session_id: string;
-   sdk_session_id: string;
+   content_session_id: string;
+   memory_session_id: string;
    project: string;
    prompt_number: number;
    prompt_text: string;
@@ -764,23 +826,23 @@ export class SessionStore {
    const stmt = this.db.prepare(`
      SELECT
        up.*,
-       s.sdk_session_id,
+       s.memory_session_id,
        s.project
      FROM user_prompts up
-     JOIN sdk_sessions s ON up.claude_session_id = s.claude_session_id
-     WHERE up.claude_session_id = ?
+     JOIN sdk_sessions s ON up.content_session_id = s.content_session_id
+     WHERE up.content_session_id = ?
      ORDER BY up.created_at_epoch DESC
      LIMIT 1
    `);

-   return stmt.get(claudeSessionId) as LatestPromptResult | undefined;
+   return stmt.get(contentSessionId) as LatestPromptResult | undefined;
  }

  /**
   * Get recent sessions with their status and summary info
   */
  getRecentSessionsWithStatus(project: string, limit: number = 3): Array<{
-   sdk_session_id: string | null;
+   memory_session_id: string | null;
    status: string;
    started_at: string;
    user_prompt: string | null;
@@ -789,16 +851,16 @@ export class SessionStore {
|
||||
const stmt = this.db.prepare(`
|
||||
SELECT * FROM (
|
||||
SELECT
|
||||
s.sdk_session_id,
|
||||
s.memory_session_id,
|
||||
s.status,
|
||||
s.started_at,
|
||||
s.started_at_epoch,
|
||||
s.user_prompt,
|
||||
CASE WHEN sum.sdk_session_id IS NOT NULL THEN 1 ELSE 0 END as has_summary
|
||||
CASE WHEN sum.memory_session_id IS NOT NULL THEN 1 ELSE 0 END as has_summary
|
||||
FROM sdk_sessions s
|
||||
LEFT JOIN session_summaries sum ON s.sdk_session_id = sum.sdk_session_id
|
||||
WHERE s.project = ? AND s.sdk_session_id IS NOT NULL
|
||||
GROUP BY s.sdk_session_id
|
||||
LEFT JOIN session_summaries sum ON s.memory_session_id = sum.memory_session_id
|
||||
WHERE s.project = ? AND s.memory_session_id IS NOT NULL
|
||||
GROUP BY s.memory_session_id
|
||||
ORDER BY s.started_at_epoch DESC
|
||||
LIMIT ?
|
||||
)
|
||||
@@ -811,7 +873,7 @@ export class SessionStore {
|
||||
/**
|
||||
* Get observations for a specific session
|
||||
*/
|
||||
getObservationsForSession(sdkSessionId: string): Array<{
|
||||
getObservationsForSession(memorySessionId: string): Array<{
|
||||
title: string;
|
||||
subtitle: string;
|
||||
type: string;
|
||||
@@ -820,11 +882,11 @@ export class SessionStore {
|
||||
const stmt = this.db.prepare(`
|
||||
SELECT title, subtitle, type, prompt_number
|
||||
FROM observations
|
||||
WHERE sdk_session_id = ?
|
||||
WHERE memory_session_id = ?
|
||||
ORDER BY created_at_epoch ASC
|
||||
`);
|
||||
|
||||
return stmt.all(sdkSessionId);
|
||||
return stmt.all(memorySessionId);
|
||||
}
|
||||
|
||||
/**
|
||||
@@ -916,7 +978,7 @@ export class SessionStore {
|
||||
/**
|
||||
* Get summary for a specific session
|
||||
*/
|
||||
getSummaryForSession(sdkSessionId: string): {
|
||||
getSummaryForSession(memorySessionId: string): {
|
||||
request: string | null;
|
||||
investigated: string | null;
|
||||
learned: string | null;
|
||||
@@ -935,28 +997,28 @@ export class SessionStore {
|
||||
files_read, files_edited, notes, prompt_number, created_at,
|
||||
created_at_epoch
|
||||
FROM session_summaries
|
||||
WHERE sdk_session_id = ?
|
||||
WHERE memory_session_id = ?
|
||||
ORDER BY created_at_epoch DESC
|
||||
LIMIT 1
|
||||
`);
|
||||
|
||||
return stmt.get(sdkSessionId) || null;
|
||||
return stmt.get(memorySessionId) || null;
|
||||
}
|
||||
|
||||
/**
|
||||
* Get aggregated files from all observations for a session
|
||||
*/
|
||||
getFilesForSession(sdkSessionId: string): {
|
||||
getFilesForSession(memorySessionId: string): {
|
||||
filesRead: string[];
|
||||
filesModified: string[];
|
||||
} {
|
||||
const stmt = this.db.prepare(`
|
||||
SELECT files_read, files_modified
|
||||
FROM observations
|
||||
WHERE sdk_session_id = ?
|
||||
WHERE memory_session_id = ?
|
||||
`);
|
||||
|
||||
const rows = stmt.all(sdkSessionId) as Array<{
|
||||
const rows = stmt.all(memorySessionId) as Array<{
|
||||
files_read: string | null;
|
||||
files_modified: string | null;
|
||||
}>;
|
||||
@@ -993,13 +1055,13 @@ export class SessionStore {
|
||||
*/
|
||||
getSessionById(id: number): {
|
||||
id: number;
|
||||
claude_session_id: string;
|
||||
sdk_session_id: string | null;
|
||||
content_session_id: string;
|
||||
memory_session_id: string | null;
|
||||
project: string;
|
||||
user_prompt: string;
|
||||
} | null {
|
||||
const stmt = this.db.prepare(`
|
||||
SELECT id, claude_session_id, sdk_session_id, project, user_prompt
|
||||
SELECT id, content_session_id, memory_session_id, project, user_prompt
|
||||
FROM sdk_sessions
|
||||
WHERE id = ?
|
||||
LIMIT 1
|
||||
@@ -1012,10 +1074,10 @@ export class SessionStore {
|
||||
* Get SDK sessions by SDK session IDs
|
||||
* Used for exporting session metadata
|
||||
*/
|
||||
getSdkSessionsBySessionIds(sdkSessionIds: string[]): {
|
||||
getSdkSessionsBySessionIds(memorySessionIds: string[]): {
|
||||
id: number;
|
||||
claude_session_id: string;
|
||||
sdk_session_id: string;
|
||||
content_session_id: string;
|
||||
memory_session_id: string;
|
||||
project: string;
|
||||
user_prompt: string;
|
||||
started_at: string;
|
||||
@@ -1024,18 +1086,18 @@ export class SessionStore {
|
||||
completed_at_epoch: number | null;
|
||||
status: string;
|
||||
}[] {
|
||||
if (sdkSessionIds.length === 0) return [];
|
||||
if (memorySessionIds.length === 0) return [];
|
||||
|
||||
const placeholders = sdkSessionIds.map(() => '?').join(',');
|
||||
const placeholders = memorySessionIds.map(() => '?').join(',');
|
||||
const stmt = this.db.prepare(`
|
||||
SELECT id, claude_session_id, sdk_session_id, project, user_prompt,
|
||||
SELECT id, content_session_id, memory_session_id, project, user_prompt,
|
||||
started_at, started_at_epoch, completed_at, completed_at_epoch, status
|
||||
FROM sdk_sessions
|
||||
WHERE sdk_session_id IN (${placeholders})
|
||||
WHERE memory_session_id IN (${placeholders})
|
||||
ORDER BY started_at_epoch DESC
|
||||
`);
|
||||
|
||||
return stmt.all(...sdkSessionIds) as any[];
|
||||
return stmt.all(...memorySessionIds) as any[];
|
||||
}
|
||||
|
||||
|
||||
@@ -1047,10 +1109,10 @@ export class SessionStore {
|
||||
* Get current prompt number by counting user_prompts for this session
|
||||
* Replaces the prompt_counter column which is no longer maintained
|
||||
*/
|
||||
getPromptNumberFromUserPrompts(claudeSessionId: string): number {
|
||||
getPromptNumberFromUserPrompts(contentSessionId: string): number {
|
||||
const result = this.db.prepare(`
|
||||
SELECT COUNT(*) as count FROM user_prompts WHERE claude_session_id = ?
|
||||
`).get(claudeSessionId) as { count: number };
|
||||
SELECT COUNT(*) as count FROM user_prompts WHERE content_session_id = ?
|
||||
`).get(contentSessionId) as { count: number };
|
||||
return result.count;
|
||||
}
|
||||
|
||||
@@ -1080,20 +1142,20 @@ export class SessionStore {
|
||||
* This is KISS in action: Trust the database UNIQUE constraint and
|
||||
* INSERT OR IGNORE to handle both creation and lookup elegantly.
|
||||
*/
|
||||
createSDKSession(claudeSessionId: string, project: string, userPrompt: string): number {
|
||||
createSDKSession(contentSessionId: string, project: string, userPrompt: string): number {
|
||||
const now = new Date();
|
||||
const nowEpoch = now.getTime();
|
||||
|
||||
// Pure INSERT OR IGNORE - no updates, no complexity
|
||||
this.db.prepare(`
|
||||
INSERT OR IGNORE INTO sdk_sessions
|
||||
(claude_session_id, sdk_session_id, project, user_prompt, started_at, started_at_epoch, status)
|
||||
(content_session_id, memory_session_id, project, user_prompt, started_at, started_at_epoch, status)
|
||||
VALUES (?, ?, ?, ?, ?, ?, 'active')
|
||||
`).run(claudeSessionId, claudeSessionId, project, userPrompt, now.toISOString(), nowEpoch);
|
||||
`).run(contentSessionId, contentSessionId, project, userPrompt, now.toISOString(), nowEpoch);
|
||||
|
||||
// Return existing or new ID
|
||||
const row = this.db.prepare('SELECT id FROM sdk_sessions WHERE claude_session_id = ?')
|
||||
.get(claudeSessionId) as { id: number };
|
||||
const row = this.db.prepare('SELECT id FROM sdk_sessions WHERE content_session_id = ?')
|
||||
.get(contentSessionId) as { id: number };
|
||||
return row.id;
|
||||
}
|
||||
|
||||
@@ -1103,17 +1165,17 @@ export class SessionStore {
|
||||
/**
|
||||
* Save a user prompt
|
||||
*/
|
||||
saveUserPrompt(claudeSessionId: string, promptNumber: number, promptText: string): number {
|
||||
saveUserPrompt(contentSessionId: string, promptNumber: number, promptText: string): number {
|
||||
const now = new Date();
|
||||
const nowEpoch = now.getTime();
|
||||
|
||||
const stmt = this.db.prepare(`
|
||||
INSERT INTO user_prompts
|
||||
(claude_session_id, prompt_number, prompt_text, created_at, created_at_epoch)
|
||||
(content_session_id, prompt_number, prompt_text, created_at, created_at_epoch)
|
||||
VALUES (?, ?, ?, ?, ?)
|
||||
`);
|
||||
|
||||
const result = stmt.run(claudeSessionId, promptNumber, promptText, now.toISOString(), nowEpoch);
|
||||
const result = stmt.run(contentSessionId, promptNumber, promptText, now.toISOString(), nowEpoch);
|
||||
return result.lastInsertRowid as number;
|
||||
}
|
||||
|
||||
@@ -1121,15 +1183,15 @@ export class SessionStore {
 * Get user prompt by session ID and prompt number
 * Returns the prompt text, or null if not found
 */
-getUserPrompt(claudeSessionId: string, promptNumber: number): string | null {
+getUserPrompt(contentSessionId: string, promptNumber: number): string | null {
 const stmt = this.db.prepare(`
 SELECT prompt_text
 FROM user_prompts
-WHERE claude_session_id = ? AND prompt_number = ?
+WHERE content_session_id = ? AND prompt_number = ?
 LIMIT 1
 `);

-const result = stmt.get(claudeSessionId, promptNumber) as { prompt_text: string } | undefined;
+const result = stmt.get(contentSessionId, promptNumber) as { prompt_text: string } | undefined;
 return result?.prompt_text ?? null;
 }

@@ -1138,7 +1200,7 @@ export class SessionStore {
 * Assumes session already exists (created by hook)
 */
 storeObservation(
-sdkSessionId: string,
+memorySessionId: string,
 project: string,
 observation: {
 type: string;
@@ -1160,13 +1222,13 @@ export class SessionStore {

 const stmt = this.db.prepare(`
 INSERT INTO observations
-(sdk_session_id, project, type, title, subtitle, facts, narrative, concepts,
+(memory_session_id, project, type, title, subtitle, facts, narrative, concepts,
 files_read, files_modified, prompt_number, discovery_tokens, created_at, created_at_epoch)
 VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
 `);

 const result = stmt.run(
-sdkSessionId,
+memorySessionId,
 project,
 observation.type,
 observation.title,
@@ -1193,7 +1255,7 @@ export class SessionStore {
 * Assumes session already exists - will fail with FK error if not
 */
 storeSummary(
-sdkSessionId: string,
+memorySessionId: string,
 project: string,
 summary: {
 request: string;
@@ -1213,13 +1275,13 @@ export class SessionStore {

 const stmt = this.db.prepare(`
 INSERT INTO session_summaries
-(sdk_session_id, project, request, investigated, learned, completed,
+(memory_session_id, project, request, investigated, learned, completed,
 next_steps, notes, prompt_number, discovery_tokens, created_at, created_at_epoch)
 VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
 `);

 const result = stmt.run(
-sdkSessionId,
+memorySessionId,
 project,
 summary.request,
 summary.investigated,
@@ -1302,9 +1364,9 @@ export class SessionStore {
 SELECT
 up.*,
 s.project,
-s.sdk_session_id
+s.memory_session_id
 FROM user_prompts up
-JOIN sdk_sessions s ON up.claude_session_id = s.claude_session_id
+JOIN sdk_sessions s ON up.content_session_id = s.content_session_id
 WHERE up.id IN (${placeholders}) ${projectFilter}
 ORDER BY up.created_at_epoch ${orderClause}
 ${limitClause}
@@ -1437,9 +1499,9 @@ export class SessionStore {
 `;

 const promptQuery = `
-SELECT up.*, s.project, s.sdk_session_id
+SELECT up.*, s.project, s.memory_session_id
 FROM user_prompts up
-JOIN sdk_sessions s ON up.claude_session_id = s.claude_session_id
+JOIN sdk_sessions s ON up.content_session_id = s.content_session_id
 WHERE up.created_at_epoch >= ? AND up.created_at_epoch <= ? ${projectFilter.replace('project', 's.project')}
 ORDER BY up.created_at_epoch ASC
 `;
@@ -1453,7 +1515,7 @@ export class SessionStore {
 observations,
 sessions: sessions.map(s => ({
 id: s.id,
-sdk_session_id: s.sdk_session_id,
+memory_session_id: s.memory_session_id,
 project: s.project,
 request: s.request,
 completed: s.completed,
@@ -1463,7 +1525,7 @@ export class SessionStore {
 })),
 prompts: prompts.map(p => ({
 id: p.id,
-claude_session_id: p.claude_session_id,
+content_session_id: p.content_session_id,
 prompt_number: p.prompt_number,
 prompt_text: p.prompt_text,
 project: p.project,
@@ -1482,7 +1544,7 @@ export class SessionStore {
 */
 getPromptById(id: number): {
 id: number;
-claude_session_id: string;
+content_session_id: string;
 prompt_number: number;
 prompt_text: string;
 project: string;
@@ -1492,14 +1554,14 @@ export class SessionStore {
 const stmt = this.db.prepare(`
 SELECT
 p.id,
-p.claude_session_id,
+p.content_session_id,
 p.prompt_number,
 p.prompt_text,
 s.project,
 p.created_at,
 p.created_at_epoch
 FROM user_prompts p
-LEFT JOIN sdk_sessions s ON p.claude_session_id = s.claude_session_id
+LEFT JOIN sdk_sessions s ON p.content_session_id = s.content_session_id
 WHERE p.id = ?
 LIMIT 1
 `);
@@ -1512,7 +1574,7 @@ export class SessionStore {
 */
 getPromptsByIds(ids: number[]): Array<{
 id: number;
-claude_session_id: string;
+content_session_id: string;
 prompt_number: number;
 prompt_text: string;
 project: string;
@@ -1525,21 +1587,21 @@ export class SessionStore {
 const stmt = this.db.prepare(`
 SELECT
 p.id,
-p.claude_session_id,
+p.content_session_id,
 p.prompt_number,
 p.prompt_text,
 s.project,
 p.created_at,
 p.created_at_epoch
 FROM user_prompts p
-LEFT JOIN sdk_sessions s ON p.claude_session_id = s.claude_session_id
+LEFT JOIN sdk_sessions s ON p.content_session_id = s.content_session_id
 WHERE p.id IN (${placeholders})
 ORDER BY p.created_at_epoch DESC
 `);

 return stmt.all(...ids) as Array<{
 id: number;
-claude_session_id: string;
+content_session_id: string;
 prompt_number: number;
 prompt_text: string;
 project: string;
@@ -1553,8 +1615,8 @@ export class SessionStore {
 */
 getSessionSummaryById(id: number): {
 id: number;
-sdk_session_id: string | null;
-claude_session_id: string;
+memory_session_id: string | null;
+content_session_id: string;
 project: string;
 user_prompt: string;
 request_summary: string | null;
@@ -1566,8 +1628,8 @@ export class SessionStore {
 const stmt = this.db.prepare(`
 SELECT
 id,
-sdk_session_id,
-claude_session_id,
+memory_session_id,
+content_session_id,
 project,
 user_prompt,
 request_summary,
@@ -1599,8 +1661,8 @@ export class SessionStore {
 * Returns: { imported: boolean, id: number }
 */
 importSdkSession(session: {
-claude_session_id: string;
-sdk_session_id: string;
+content_session_id: string;
+memory_session_id: string;
 project: string;
 user_prompt: string;
 started_at: string;
@@ -1611,8 +1673,8 @@ export class SessionStore {
 }): { imported: boolean; id: number } {
 // Check if session already exists
 const existing = this.db.prepare(
-'SELECT id FROM sdk_sessions WHERE claude_session_id = ?'
-).get(session.claude_session_id) as { id: number } | undefined;
+'SELECT id FROM sdk_sessions WHERE content_session_id = ?'
+).get(session.content_session_id) as { id: number } | undefined;

 if (existing) {
 return { imported: false, id: existing.id };
@@ -1620,14 +1682,14 @@ export class SessionStore {

 const stmt = this.db.prepare(`
 INSERT INTO sdk_sessions (
-claude_session_id, sdk_session_id, project, user_prompt,
+content_session_id, memory_session_id, project, user_prompt,
 started_at, started_at_epoch, completed_at, completed_at_epoch, status
 ) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?)
 `);

 const result = stmt.run(
-session.claude_session_id,
-session.sdk_session_id,
+session.content_session_id,
+session.memory_session_id,
 session.project,
 session.user_prompt,
 session.started_at,
@@ -1645,7 +1707,7 @@ export class SessionStore {
 * Returns: { imported: boolean, id: number }
 */
 importSessionSummary(summary: {
-sdk_session_id: string;
+memory_session_id: string;
 project: string;
 request: string | null;
 investigated: string | null;
@@ -1662,8 +1724,8 @@ export class SessionStore {
 }): { imported: boolean; id: number } {
 // Check if summary already exists for this session
 const existing = this.db.prepare(
-'SELECT id FROM session_summaries WHERE sdk_session_id = ?'
-).get(summary.sdk_session_id) as { id: number } | undefined;
+'SELECT id FROM session_summaries WHERE memory_session_id = ?'
+).get(summary.memory_session_id) as { id: number } | undefined;

 if (existing) {
 return { imported: false, id: existing.id };
@@ -1671,14 +1733,14 @@ export class SessionStore {

 const stmt = this.db.prepare(`
 INSERT INTO session_summaries (
-sdk_session_id, project, request, investigated, learned,
+memory_session_id, project, request, investigated, learned,
 completed, next_steps, files_read, files_edited, notes,
 prompt_number, discovery_tokens, created_at, created_at_epoch
 ) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
 `);

 const result = stmt.run(
-summary.sdk_session_id,
+summary.memory_session_id,
 summary.project,
 summary.request,
 summary.investigated,
@@ -1699,11 +1761,11 @@ export class SessionStore {

 /**
 * Import observation with duplicate checking
-* Duplicates are identified by sdk_session_id + title + created_at_epoch
+* Duplicates are identified by memory_session_id + title + created_at_epoch
 * Returns: { imported: boolean, id: number }
 */
 importObservation(obs: {
-sdk_session_id: string;
+memory_session_id: string;
 project: string;
 text: string | null;
 type: string;
@@ -1722,8 +1784,8 @@ export class SessionStore {
 // Check if observation already exists
 const existing = this.db.prepare(`
 SELECT id FROM observations
-WHERE sdk_session_id = ? AND title = ? AND created_at_epoch = ?
-`).get(obs.sdk_session_id, obs.title, obs.created_at_epoch) as { id: number } | undefined;
+WHERE memory_session_id = ? AND title = ? AND created_at_epoch = ?
+`).get(obs.memory_session_id, obs.title, obs.created_at_epoch) as { id: number } | undefined;

 if (existing) {
 return { imported: false, id: existing.id };
@@ -1731,14 +1793,14 @@ export class SessionStore {

 const stmt = this.db.prepare(`
 INSERT INTO observations (
-sdk_session_id, project, text, type, title, subtitle,
+memory_session_id, project, text, type, title, subtitle,
 facts, narrative, concepts, files_read, files_modified,
 prompt_number, discovery_tokens, created_at, created_at_epoch
 ) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
 `);

 const result = stmt.run(
-obs.sdk_session_id,
+obs.memory_session_id,
 obs.project,
 obs.text,
 obs.type,
@@ -1760,11 +1822,11 @@ export class SessionStore {

 /**
 * Import user prompt with duplicate checking
-* Duplicates are identified by claude_session_id + prompt_number
+* Duplicates are identified by content_session_id + prompt_number
 * Returns: { imported: boolean, id: number }
 */
 importUserPrompt(prompt: {
-claude_session_id: string;
+content_session_id: string;
 prompt_number: number;
 prompt_text: string;
 created_at: string;
@@ -1773,8 +1835,8 @@ export class SessionStore {
 // Check if prompt already exists
 const existing = this.db.prepare(`
 SELECT id FROM user_prompts
-WHERE claude_session_id = ? AND prompt_number = ?
-`).get(prompt.claude_session_id, prompt.prompt_number) as { id: number } | undefined;
+WHERE content_session_id = ? AND prompt_number = ?
+`).get(prompt.content_session_id, prompt.prompt_number) as { id: number } | undefined;

 if (existing) {
 return { imported: false, id: existing.id };
@@ -1782,13 +1844,13 @@ export class SessionStore {

 const stmt = this.db.prepare(`
 INSERT INTO user_prompts (
-claude_session_id, prompt_number, prompt_text,
+content_session_id, prompt_number, prompt_text,
 created_at, created_at_epoch
 ) VALUES (?, ?, ?, ?, ?)
 `);

 const result = stmt.run(
-prompt.claude_session_id,
+prompt.content_session_id,
 prompt.prompt_number,
 prompt.prompt_text,
 prompt.created_at,

@@ -170,8 +170,8 @@ export const migration003: Migration = {
 db.run(`
 CREATE TABLE IF NOT EXISTS streaming_sessions (
 id INTEGER PRIMARY KEY AUTOINCREMENT,
-claude_session_id TEXT UNIQUE NOT NULL,
-sdk_session_id TEXT,
+content_session_id TEXT UNIQUE NOT NULL,
+memory_session_id TEXT,
 project TEXT NOT NULL,
 title TEXT,
 subtitle TEXT,
@@ -185,8 +185,8 @@ export const migration003: Migration = {
 status TEXT NOT NULL DEFAULT 'active'
 );

-CREATE INDEX IF NOT EXISTS idx_streaming_sessions_claude_id ON streaming_sessions(claude_session_id);
-CREATE INDEX IF NOT EXISTS idx_streaming_sessions_sdk_id ON streaming_sessions(sdk_session_id);
+CREATE INDEX IF NOT EXISTS idx_streaming_sessions_claude_id ON streaming_sessions(content_session_id);
+CREATE INDEX IF NOT EXISTS idx_streaming_sessions_sdk_id ON streaming_sessions(memory_session_id);
 CREATE INDEX IF NOT EXISTS idx_streaming_sessions_project ON streaming_sessions(project);
 CREATE INDEX IF NOT EXISTS idx_streaming_sessions_status ON streaming_sessions(status);
 CREATE INDEX IF NOT EXISTS idx_streaming_sessions_started ON streaming_sessions(started_at_epoch DESC);
@@ -213,8 +213,8 @@ export const migration004: Migration = {
 db.run(`
 CREATE TABLE IF NOT EXISTS sdk_sessions (
 id INTEGER PRIMARY KEY AUTOINCREMENT,
-claude_session_id TEXT UNIQUE NOT NULL,
-sdk_session_id TEXT UNIQUE,
+content_session_id TEXT UNIQUE NOT NULL,
+memory_session_id TEXT UNIQUE,
 project TEXT NOT NULL,
 user_prompt TEXT,
 started_at TEXT NOT NULL,
@@ -224,8 +224,8 @@ export const migration004: Migration = {
 status TEXT CHECK(status IN ('active', 'completed', 'failed')) NOT NULL DEFAULT 'active'
 );

-CREATE INDEX IF NOT EXISTS idx_sdk_sessions_claude_id ON sdk_sessions(claude_session_id);
-CREATE INDEX IF NOT EXISTS idx_sdk_sessions_sdk_id ON sdk_sessions(sdk_session_id);
+CREATE INDEX IF NOT EXISTS idx_sdk_sessions_claude_id ON sdk_sessions(content_session_id);
+CREATE INDEX IF NOT EXISTS idx_sdk_sessions_sdk_id ON sdk_sessions(memory_session_id);
 CREATE INDEX IF NOT EXISTS idx_sdk_sessions_project ON sdk_sessions(project);
 CREATE INDEX IF NOT EXISTS idx_sdk_sessions_status ON sdk_sessions(status);
 CREATE INDEX IF NOT EXISTS idx_sdk_sessions_started ON sdk_sessions(started_at_epoch DESC);
@@ -235,34 +235,34 @@ export const migration004: Migration = {
 db.run(`
 CREATE TABLE IF NOT EXISTS observation_queue (
 id INTEGER PRIMARY KEY AUTOINCREMENT,
-sdk_session_id TEXT NOT NULL,
+memory_session_id TEXT NOT NULL,
 tool_name TEXT NOT NULL,
 tool_input TEXT NOT NULL,
 tool_output TEXT NOT NULL,
 created_at_epoch INTEGER NOT NULL,
 processed_at_epoch INTEGER,
-FOREIGN KEY(sdk_session_id) REFERENCES sdk_sessions(sdk_session_id) ON DELETE CASCADE
+FOREIGN KEY(memory_session_id) REFERENCES sdk_sessions(memory_session_id) ON DELETE CASCADE
 );

-CREATE INDEX IF NOT EXISTS idx_observation_queue_sdk_session ON observation_queue(sdk_session_id);
+CREATE INDEX IF NOT EXISTS idx_observation_queue_sdk_session ON observation_queue(memory_session_id);
 CREATE INDEX IF NOT EXISTS idx_observation_queue_processed ON observation_queue(processed_at_epoch);
-CREATE INDEX IF NOT EXISTS idx_observation_queue_pending ON observation_queue(sdk_session_id, processed_at_epoch);
+CREATE INDEX IF NOT EXISTS idx_observation_queue_pending ON observation_queue(memory_session_id, processed_at_epoch);
 `);

 // Observations table - stores extracted observations (what SDK decides is important)
 db.run(`
 CREATE TABLE IF NOT EXISTS observations (
 id INTEGER PRIMARY KEY AUTOINCREMENT,
-sdk_session_id TEXT NOT NULL,
+memory_session_id TEXT NOT NULL,
 project TEXT NOT NULL,
 text TEXT NOT NULL,
 type TEXT NOT NULL CHECK(type IN ('decision', 'bugfix', 'feature', 'refactor', 'discovery')),
 created_at TEXT NOT NULL,
 created_at_epoch INTEGER NOT NULL,
-FOREIGN KEY(sdk_session_id) REFERENCES sdk_sessions(sdk_session_id) ON DELETE CASCADE
+FOREIGN KEY(memory_session_id) REFERENCES sdk_sessions(memory_session_id) ON DELETE CASCADE
 );

-CREATE INDEX IF NOT EXISTS idx_observations_sdk_session ON observations(sdk_session_id);
+CREATE INDEX IF NOT EXISTS idx_observations_sdk_session ON observations(memory_session_id);
 CREATE INDEX IF NOT EXISTS idx_observations_project ON observations(project);
 CREATE INDEX IF NOT EXISTS idx_observations_type ON observations(type);
 CREATE INDEX IF NOT EXISTS idx_observations_created ON observations(created_at_epoch DESC);
@@ -272,7 +272,7 @@ export const migration004: Migration = {
 db.run(`
 CREATE TABLE IF NOT EXISTS session_summaries (
 id INTEGER PRIMARY KEY AUTOINCREMENT,
-sdk_session_id TEXT UNIQUE NOT NULL,
+memory_session_id TEXT UNIQUE NOT NULL,
 project TEXT NOT NULL,
 request TEXT,
 investigated TEXT,
@@ -284,10 +284,10 @@ export const migration004: Migration = {
 notes TEXT,
 created_at TEXT NOT NULL,
 created_at_epoch INTEGER NOT NULL,
-FOREIGN KEY(sdk_session_id) REFERENCES sdk_sessions(sdk_session_id) ON DELETE CASCADE
+FOREIGN KEY(memory_session_id) REFERENCES sdk_sessions(memory_session_id) ON DELETE CASCADE
 );

-CREATE INDEX IF NOT EXISTS idx_session_summaries_sdk_session ON session_summaries(sdk_session_id);
+CREATE INDEX IF NOT EXISTS idx_session_summaries_sdk_session ON session_summaries(memory_session_id);
 CREATE INDEX IF NOT EXISTS idx_session_summaries_project ON session_summaries(project);
 CREATE INDEX IF NOT EXISTS idx_session_summaries_created ON session_summaries(created_at_epoch DESC);
 `);
@@ -329,8 +329,8 @@ export const migration005: Migration = {
 db.run(`
 CREATE TABLE IF NOT EXISTS streaming_sessions (
 id INTEGER PRIMARY KEY AUTOINCREMENT,
-claude_session_id TEXT UNIQUE NOT NULL,
-sdk_session_id TEXT,
+content_session_id TEXT UNIQUE NOT NULL,
+memory_session_id TEXT,
 project TEXT NOT NULL,
 title TEXT,
 subtitle TEXT,
@@ -348,13 +348,13 @@ export const migration005: Migration = {
 db.run(`
 CREATE TABLE IF NOT EXISTS observation_queue (
 id INTEGER PRIMARY KEY AUTOINCREMENT,
-sdk_session_id TEXT NOT NULL,
+memory_session_id TEXT NOT NULL,
 tool_name TEXT NOT NULL,
 tool_input TEXT NOT NULL,
 tool_output TEXT NOT NULL,
 created_at_epoch INTEGER NOT NULL,
 processed_at_epoch INTEGER,
-FOREIGN KEY(sdk_session_id) REFERENCES sdk_sessions(sdk_session_id) ON DELETE CASCADE
+FOREIGN KEY(memory_session_id) REFERENCES sdk_sessions(memory_session_id) ON DELETE CASCADE
 )
 `);


@@ -188,8 +188,8 @@ export function normalizeTimestamp(timestamp: string | Date | number | undefined
 */
 export interface SDKSessionRow {
 id: number;
-claude_session_id: string;
-sdk_session_id: string | null;
+content_session_id: string;
+memory_session_id: string | null;
 project: string;
 user_prompt: string | null;
 started_at: string;
@@ -203,7 +203,7 @@ export interface SDKSessionRow {

 export interface ObservationRow {
 id: number;
-sdk_session_id: string;
+memory_session_id: string;
 project: string;
 text: string | null;
 type: 'decision' | 'bugfix' | 'feature' | 'refactor' | 'discovery' | 'change';
@@ -222,7 +222,7 @@ export interface ObservationRow {

 export interface SessionSummaryRow {
 id: number;
-sdk_session_id: string;
+memory_session_id: string;
 project: string;
 request: string | null;
 investigated: string | null;
@@ -240,7 +240,7 @@ export interface SessionSummaryRow {

 export interface UserPromptRow {
 id: number;
-claude_session_id: string;
+content_session_id: string;
 prompt_number: number;
 prompt_text: string;
 created_at: string;

@@ -26,7 +26,7 @@ interface ChromaDocument {

 interface StoredObservation {
 id: number;
-sdk_session_id: string;
+memory_session_id: string;
 project: string;
 text: string | null;
 type: string;
@@ -45,7 +45,7 @@ interface StoredObservation {

 interface StoredSummary {
 id: number;
-sdk_session_id: string;
+memory_session_id: string;
 project: string;
 request: string | null;
 investigated: string | null;
@@ -61,12 +61,12 @@ interface StoredSummary {

 interface StoredUserPrompt {
 id: number;
-claude_session_id: string;
+content_session_id: string;
 prompt_number: number;
 prompt_text: string;
 created_at: string;
 created_at_epoch: number;
-sdk_session_id: string;
+memory_session_id: string;
 project: string;
 }

@@ -201,7 +201,7 @@ export class ChromaSync {
 const baseMetadata: Record<string, string | number> = {
 sqlite_id: obs.id,
 doc_type: 'observation',
-sdk_session_id: obs.sdk_session_id,
+memory_session_id: obs.memory_session_id,
 project: obs.project,
 created_at_epoch: obs.created_at_epoch,
 type: obs.type || 'discovery',
@@ -262,7 +262,7 @@ export class ChromaSync {
 const baseMetadata: Record<string, string | number> = {
 sqlite_id: summary.id,
 doc_type: 'session_summary',
-sdk_session_id: summary.sdk_session_id,
+memory_session_id: summary.memory_session_id,
 project: summary.project,
 created_at_epoch: summary.created_at_epoch,
 prompt_number: summary.prompt_number || 0
@@ -368,7 +368,7 @@ export class ChromaSync {
 */
 async syncObservation(
 observationId: number,
-sdkSessionId: string,
+memorySessionId: string,
 project: string,
 obs: ParsedObservation,
 promptNumber: number,
@@ -378,7 +378,7 @@ export class ChromaSync {
 // Convert ParsedObservation to StoredObservation format
 const stored: StoredObservation = {
 id: observationId,
-sdk_session_id: sdkSessionId,
+memory_session_id: memorySessionId,
 project: project,
 text: null, // Legacy field, not used
 type: obs.type,
@@ -412,7 +412,7 @@ export class ChromaSync {
 */
 async syncSummary(
 summaryId: number,
-sdkSessionId: string,
+memorySessionId: string,
 project: string,
 summary: ParsedSummary,
 promptNumber: number,
@@ -422,7 +422,7 @@ export class ChromaSync {
 // Convert ParsedSummary to StoredSummary format
 const stored: StoredSummary = {
 id: summaryId,
-sdk_session_id: sdkSessionId,
+memory_session_id: memorySessionId,
 project: project,
 request: summary.request,
 investigated: summary.investigated,
@@ -458,7 +458,7 @@ export class ChromaSync {
 metadata: {
 sqlite_id: prompt.id,
 doc_type: 'user_prompt',
-sdk_session_id: prompt.sdk_session_id,
+memory_session_id: prompt.memory_session_id,
 project: prompt.project,
 created_at_epoch: prompt.created_at_epoch,
 prompt_number: prompt.prompt_number
@@ -472,7 +472,7 @@ export class ChromaSync {
 */
 async syncUserPrompt(
 promptId: number,
-sdkSessionId: string,
+memorySessionId: string,
 project: string,
 promptText: string,
 promptNumber: number,
@@ -481,12 +481,12 @@ export class ChromaSync {
 // Create StoredUserPrompt format
const stored: StoredUserPrompt = {
|
||||
id: promptId,
|
||||
claude_session_id: '', // Not needed for Chroma sync
|
||||
content_session_id: '', // Not needed for Chroma sync
|
||||
prompt_number: promptNumber,
|
||||
prompt_text: promptText,
|
||||
created_at: new Date(createdAtEpoch * 1000).toISOString(),
|
||||
created_at_epoch: createdAtEpoch,
|
||||
sdk_session_id: sdkSessionId,
|
||||
memory_session_id: memorySessionId,
|
||||
project: project
|
||||
};
|
||||
|
||||
@@ -697,9 +697,9 @@ export class ChromaSync {
|
||||
SELECT
|
||||
up.*,
|
||||
s.project,
|
||||
s.sdk_session_id
|
||||
s.memory_session_id
|
||||
FROM user_prompts up
|
||||
JOIN sdk_sessions s ON up.claude_session_id = s.claude_session_id
|
||||
JOIN sdk_sessions s ON up.content_session_id = s.content_session_id
|
||||
WHERE s.project = ? ${promptExclusionClause}
|
||||
ORDER BY up.id ASC
|
||||
`).all(this.project) as StoredUserPrompt[];
|
||||
@@ -707,7 +707,7 @@ export class ChromaSync {
|
||||
const totalPromptCount = db.db.prepare(`
|
||||
SELECT COUNT(*) as count
|
||||
FROM user_prompts up
|
||||
JOIN sdk_sessions s ON up.claude_session_id = s.claude_session_id
|
||||
JOIN sdk_sessions s ON up.content_session_id = s.content_session_id
|
||||
WHERE s.project = ?
|
||||
`).get(this.project) as { count: number };
|
||||
|
||||
|
||||
@@ -103,19 +103,34 @@ function acquireLock(command: string): boolean {
    startedAt: new Date().toISOString()
  };
 
-  try {
-    // O_EXCL ensures atomic creation - fails if file exists
-    const fd = fs.openSync(LOCK_FILE, fs.constants.O_CREAT | fs.constants.O_EXCL | fs.constants.O_WRONLY);
-    fs.writeSync(fd, JSON.stringify(lockInfo, null, 2));
-    fs.closeSync(fd);
-    return true;
-  } catch (error: unknown) {
-    if ((error as NodeJS.ErrnoException).code === 'EEXIST') {
-      return false;
-    }
-    logger.warn('SYSTEM', 'Lock acquisition error', { error: (error as Error).message });
-    return false;
-  }
+  let retries = 3;
+  while (retries > 0) {
+    try {
+      // O_EXCL ensures atomic creation - fails if file exists
+      const fd = fs.openSync(LOCK_FILE, fs.constants.O_CREAT | fs.constants.O_EXCL | fs.constants.O_WRONLY);
+      fs.writeSync(fd, JSON.stringify(lockInfo, null, 2));
+      fs.closeSync(fd);
+      return true;
+    } catch (error: unknown) {
+      if ((error as NodeJS.ErrnoException).code === 'EEXIST') {
+        return false;
+      }
+      // Retry on ENOENT (can happen on Windows if file/dir state is in flux)
+      if ((error as NodeJS.ErrnoException).code === 'ENOENT') {
+        retries--;
+        if (retries === 0) {
+          logger.warn('SYSTEM', 'Lock acquisition error (ENOENT)', { error: (error as Error).message });
+          return false;
+        }
+        // Ensure directory exists and try again
+        try { mkdirSync(DATA_DIR, { recursive: true }); } catch {}
+        continue;
+      }
+      logger.warn('SYSTEM', 'Lock acquisition error', { error: (error as Error).message });
+      return false;
+    }
+  }
+  return false;
 }
 
 /**
@@ -468,37 +483,14 @@ export class WorkerService {
        return;
      }
 
-      // Delegate to the proper handler by re-processing the request
-      // Since we're already in the middleware chain, we need to call the handler directly
-      const projectName = req.query.project as string;
-      const useColors = req.query.colors === 'true';
-
-      if (!projectName) {
-        res.status(400).json({ error: 'Project parameter is required' });
-        return;
-      }
-
-      // Import context generator (runs in worker, has access to database)
-      const { generateContext } = await import('./context-generator.js');
-
-      // Use project name as CWD (generateContext uses path.basename to get project)
-      const cwd = `/context/${projectName}`;
-
-      // Generate context
-      const contextText = await generateContext(
-        {
-          session_id: 'context-inject-' + Date.now(),
-          cwd: cwd
-        },
-        useColors
-      );
-
-      // Return as plain text
-      res.setHeader('Content-Type', 'text/plain; charset=utf-8');
-      res.send(contextText);
+      // Delegate to the SearchRoutes handler which is registered after this one
+      // This avoids code duplication and "headers already sent" errors
+      next();
    } catch (error) {
      logger.error('WORKER', 'Context inject handler failed', {}, error as Error);
-      res.status(500).json({ error: error instanceof Error ? error.message : 'Internal server error' });
+      if (!res.headersSent) {
+        res.status(500).json({ error: error instanceof Error ? error.message : 'Internal server error' });
+      }
    }
  });
 }
@@ -678,7 +670,18 @@ export class WorkerService {
      this.resolveInitialization();
      logger.info('SYSTEM', 'Background initialization complete');
 
-      // Note: Auto-recovery of orphaned queues disabled - use /api/pending-queue/process endpoint instead
+      // Auto-recover orphaned queues on startup (process pending work from previous sessions)
+      this.processPendingQueues(50).then(result => {
+        if (result.sessionsStarted > 0) {
+          logger.info('SYSTEM', `Auto-recovered ${result.sessionsStarted} sessions with pending work`, {
+            totalPending: result.totalPendingSessions,
+            started: result.sessionsStarted,
+            sessionIds: result.startedSessionIds
+          });
+        }
+      }).catch(error => {
+        logger.warn('SYSTEM', 'Auto-recovery of pending queues failed', {}, error as Error);
+      });
    } catch (error) {
      logger.error('SYSTEM', 'Background initialization failed', {}, error as Error);
      // Don't resolve - let the promise remain pending so readiness check continues to fail
@@ -686,6 +689,45 @@ export class WorkerService {
    }
  }
 
+  /**
+   * Start a session processor
+   * It will run continuously until the session is deleted/aborted
+   */
+  private startSessionProcessor(
+    session: ReturnType<typeof this.sessionManager.getSession>,
+    source: string
+  ): void {
+    if (!session) return;
+
+    const sid = session.sessionDbId;
+    logger.info('SYSTEM', `Starting generator (${source})`, {
+      sessionId: sid
+    });
+
+    session.generatorPromise = this.sdkAgent.startSession(session, this)
+      .catch(error => {
+        // Only log if not aborted
+        if (session.abortController.signal.aborted) return;
+
+        logger.error('SYSTEM', `Generator failed (${source})`, {
+          sessionId: sid,
+          error: error.message
+        }, error);
+      })
+      .finally(() => {
+        session.generatorPromise = null;
+        this.broadcastProcessingStatus();
+
+        // Crash recovery: if not aborted, check if we should restart
+        if (!session.abortController.signal.aborted) {
+          // We can check if there are pending messages to decide if restart is urgent
+          // But generally, if it crashed, we might want to restart?
+          // For now, let's just log. The user/system can trigger restart if needed.
+          logger.warn('SYSTEM', `Session processor exited unexpectedly`, { sessionId: sid });
+        }
+      });
+  }
+
  /**
   * Process pending session queues
   * Starts SDK agents for sessions that have pending messages but no active processor
@@ -738,11 +780,7 @@ export class WorkerService {
        });
 
-        // Start SDK agent (non-blocking)
-        session.generatorPromise = this.sdkAgent.startSession(session, this)
-          .finally(() => {
-            session.generatorPromise = null;
-            this.broadcastProcessingStatus();
-          });
+        this.startSessionProcessor(session, 'startup-recovery');
 
        result.sessionsStarted++;
        result.startedSessionIds.push(sessionDbId);
@@ -1030,7 +1068,13 @@ async function main() {
 
  try {
    await httpShutdown(port);
-    await waitForPortFree(port, getPlatformTimeout(15000));
+    const freed = await waitForPortFree(port, getPlatformTimeout(15000));
+
+    if (!freed) {
+      logger.warn('SYSTEM', 'Port did not free up after shutdown', { port });
+      // Could force kill here if we knew the PID, but for now just warn
+    }
+
    removePidFile();
    releaseLock();
    logger.info('SYSTEM', 'Worker stopped successfully');
@@ -1057,7 +1101,14 @@ async function main() {
 
  try {
    await httpShutdown(port);
-    await waitForPortFree(port, getPlatformTimeout(15000));
+    const freed = await waitForPortFree(port, getPlatformTimeout(15000));
+
+    if (!freed) {
+      releaseLock();
+      logger.error('SYSTEM', 'Port did not free up after shutdown, aborting restart', { port });
+      process.exit(1);
+    }
+
    removePidFile();
 
    const child = spawn(process.execPath, [__filename, '--daemon'], {
 
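As an aside on the lock-acquisition change above: `O_CREAT | O_EXCL` is what makes creating the lock file atomic (the open fails with `EEXIST` if another process already holds it), and the `ENOENT` retry covers a parent directory that does not exist yet. A standalone sketch of the same pattern, outside the repo's code (the `acquireLockFile` helper and paths here are illustrative, not the project's identifiers):

```typescript
import * as fs from 'node:fs';
import * as path from 'node:path';
import * as os from 'node:os';

function acquireLockFile(lockPath: string, payload: object, retries = 3): boolean {
  while (retries > 0) {
    try {
      // O_CREAT | O_EXCL makes creation atomic: it fails with EEXIST if the file exists.
      const fd = fs.openSync(lockPath, fs.constants.O_CREAT | fs.constants.O_EXCL | fs.constants.O_WRONLY);
      fs.writeSync(fd, JSON.stringify(payload, null, 2));
      fs.closeSync(fd);
      return true;
    } catch (error: unknown) {
      const code = (error as NodeJS.ErrnoException).code;
      if (code === 'EEXIST') return false; // someone else holds the lock
      if (code === 'ENOENT') {            // parent directory missing; create it and retry
        retries--;
        fs.mkdirSync(path.dirname(lockPath), { recursive: true });
        continue;
      }
      return false;
    }
  }
  return false;
}

const dir = fs.mkdtempSync(path.join(os.tmpdir(), 'lock-demo-'));
const lock = path.join(dir, 'missing-subdir', 'worker.lock');
console.log(acquireLockFile(lock, { pid: process.pid })); // true  (first caller wins)
console.log(acquireLockFile(lock, { pid: process.pid })); // false (EEXIST for the second caller)
```

Note that the retry only makes sense for transient errors like `ENOENT`; `EEXIST` is a definitive answer and returns immediately.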
@@ -19,8 +19,8 @@ export interface ConversationMessage {
|
||||
|
||||
export interface ActiveSession {
|
||||
sessionDbId: number;
|
||||
claudeSessionId: string;
|
||||
sdkSessionId: string | null;
|
||||
contentSessionId: string; // User's Claude Code session being observed
|
||||
memorySessionId: string | null; // Memory agent's session ID for resume
|
||||
project: string;
|
||||
userPrompt: string;
|
||||
pendingMessages: PendingMessage[]; // Deprecated: now using persistent store, kept for compatibility
|
||||
@@ -110,7 +110,7 @@ export interface ViewerSettings {
|
||||
|
||||
export interface Observation {
|
||||
id: number;
|
||||
sdk_session_id: string;
|
||||
memory_session_id: string; // Renamed from sdk_session_id
|
||||
project: string;
|
||||
type: string;
|
||||
title: string;
|
||||
@@ -128,7 +128,7 @@ export interface Observation {
|
||||
|
||||
export interface Summary {
|
||||
id: number;
|
||||
session_id: string; // claude_session_id (from JOIN)
|
||||
session_id: string; // content_session_id (from JOIN)
|
||||
project: string;
|
||||
request: string | null;
|
||||
investigated: string | null;
|
||||
@@ -142,7 +142,7 @@ export interface Summary {
|
||||
|
||||
export interface UserPrompt {
|
||||
id: number;
|
||||
claude_session_id: string;
|
||||
content_session_id: string; // Renamed from claude_session_id
|
||||
project: string; // From JOIN with sdk_sessions
|
||||
prompt_number: number;
|
||||
prompt_text: string;
|
||||
@@ -152,10 +152,10 @@ export interface UserPrompt {
|
||||
|
||||
export interface DBSession {
|
||||
id: number;
|
||||
claude_session_id: string;
|
||||
content_session_id: string; // Renamed from claude_session_id
|
||||
project: string;
|
||||
user_prompt: string;
|
||||
sdk_session_id: string | null;
|
||||
memory_session_id: string | null; // Renamed from sdk_session_id
|
||||
status: 'active' | 'completed' | 'failed';
|
||||
started_at: string;
|
||||
started_at_epoch: number;
|
||||
|
||||
@@ -93,8 +93,8 @@ export class DatabaseManager {
   */
  getSessionById(sessionDbId: number): {
    id: number;
-    claude_session_id: string;
-    sdk_session_id: string | null;
+    content_session_id: string;
+    memory_session_id: string | null;
    project: string;
    user_prompt: string;
  } {
 
@@ -152,8 +152,8 @@ export class GeminiAgent {
 
    // Build initial prompt
    const initPrompt = session.lastPromptNumber === 1
-      ? buildInitPrompt(session.project, session.claudeSessionId, session.userPrompt, mode)
-      : buildContinuationPrompt(session.userPrompt, session.lastPromptNumber, session.claudeSessionId, mode);
+      ? buildInitPrompt(session.project, session.contentSessionId, session.userPrompt, mode)
+      : buildContinuationPrompt(session.userPrompt, session.lastPromptNumber, session.contentSessionId, mode);
 
    // Add to conversation history and query Gemini with full context
    session.conversationHistory.push({ role: 'user', content: initPrompt });
@@ -224,7 +224,7 @@ export class GeminiAgent {
    // Build summary prompt
    const summaryPrompt = buildSummaryPrompt({
      id: session.sessionDbId,
-      sdk_session_id: session.sdkSessionId,
+      memory_session_id: session.memorySessionId,
      project: session.project,
      user_prompt: session.userPrompt,
      last_user_message: message.last_user_message || '',
@@ -374,12 +374,12 @@ export class GeminiAgent {
    originalTimestamp: number | null
  ): Promise<void> {
    // Parse observations (same XML format)
-    const observations = parseObservations(text, session.claudeSessionId);
+    const observations = parseObservations(text, session.contentSessionId);
 
    // Store observations with original timestamp (if processing backlog) or current time
    for (const obs of observations) {
      const { id: obsId, createdAtEpoch } = this.dbManager.getSessionStore().storeObservation(
-        session.claudeSessionId,
+        session.contentSessionId,
        session.project,
        obs,
        session.lastPromptNumber,
@@ -397,7 +397,7 @@ export class GeminiAgent {
      // Sync to Chroma
      this.dbManager.getChromaSync().syncObservation(
        obsId,
-        session.claudeSessionId,
+        session.contentSessionId,
        session.project,
        obs,
        session.lastPromptNumber,
@@ -413,8 +413,8 @@ export class GeminiAgent {
        type: 'new_observation',
        observation: {
          id: obsId,
-          sdk_session_id: session.sdkSessionId,
-          session_id: session.claudeSessionId,
+          memory_session_id: session.memorySessionId,
+          session_id: session.contentSessionId,
          type: obs.type,
          title: obs.title,
          subtitle: obs.subtitle,
@@ -447,7 +447,7 @@ export class GeminiAgent {
    };
 
    const { id: summaryId, createdAtEpoch } = this.dbManager.getSessionStore().storeSummary(
-      session.claudeSessionId,
+      session.contentSessionId,
      session.project,
      summaryForStore,
      session.lastPromptNumber,
@@ -464,7 +464,7 @@ export class GeminiAgent {
    // Sync to Chroma
    this.dbManager.getChromaSync().syncSummary(
      summaryId,
-      session.claudeSessionId,
+      session.contentSessionId,
      session.project,
      summaryForStore,
      session.lastPromptNumber,
@@ -480,7 +480,7 @@ export class GeminiAgent {
      type: 'new_summary',
      summary: {
        id: summaryId,
-        session_id: session.claudeSessionId,
+        session_id: session.contentSessionId,
        request: summary.request,
        investigated: summary.investigated,
        learned: summary.learned,
 
@@ -112,8 +112,8 @@ export class OpenRouterAgent {
 
    // Build initial prompt
    const initPrompt = session.lastPromptNumber === 1
-      ? buildInitPrompt(session.project, session.claudeSessionId, session.userPrompt, mode)
-      : buildContinuationPrompt(session.userPrompt, session.lastPromptNumber, session.claudeSessionId, mode);
+      ? buildInitPrompt(session.project, session.contentSessionId, session.userPrompt, mode)
+      : buildContinuationPrompt(session.userPrompt, session.lastPromptNumber, session.contentSessionId, mode);
 
    // Add to conversation history and query OpenRouter with full context
    session.conversationHistory.push({ role: 'user', content: initPrompt });
@@ -183,7 +183,7 @@ export class OpenRouterAgent {
    // Build summary prompt
    const summaryPrompt = buildSummaryPrompt({
      id: session.sessionDbId,
-      sdk_session_id: session.sdkSessionId,
+      memory_session_id: session.memorySessionId,
      project: session.project,
      user_prompt: session.userPrompt,
      last_user_message: message.last_user_message || '',
@@ -417,12 +417,12 @@ export class OpenRouterAgent {
    originalTimestamp: number | null
  ): Promise<void> {
    // Parse observations (same XML format)
-    const observations = parseObservations(text, session.claudeSessionId);
+    const observations = parseObservations(text, session.contentSessionId);
 
    // Store observations with original timestamp (if processing backlog) or current time
    for (const obs of observations) {
      const { id: obsId, createdAtEpoch } = this.dbManager.getSessionStore().storeObservation(
-        session.claudeSessionId,
+        session.contentSessionId,
        session.project,
        obs,
        session.lastPromptNumber,
@@ -440,7 +440,7 @@ export class OpenRouterAgent {
      // Sync to Chroma
      this.dbManager.getChromaSync().syncObservation(
        obsId,
-        session.claudeSessionId,
+        session.contentSessionId,
        session.project,
        obs,
        session.lastPromptNumber,
@@ -456,8 +456,8 @@ export class OpenRouterAgent {
        type: 'new_observation',
        observation: {
          id: obsId,
-          sdk_session_id: session.sdkSessionId,
-          session_id: session.claudeSessionId,
+          memory_session_id: session.memorySessionId,
+          session_id: session.contentSessionId,
          type: obs.type,
          title: obs.title,
          subtitle: obs.subtitle,
@@ -490,7 +490,7 @@ export class OpenRouterAgent {
    };
 
    const { id: summaryId, createdAtEpoch } = this.dbManager.getSessionStore().storeSummary(
-      session.claudeSessionId,
+      session.contentSessionId,
      session.project,
      summaryForStore,
      session.lastPromptNumber,
@@ -507,7 +507,7 @@ export class OpenRouterAgent {
    // Sync to Chroma
    this.dbManager.getChromaSync().syncSummary(
      summaryId,
-      session.claudeSessionId,
+      session.contentSessionId,
      session.project,
      summaryForStore,
      session.lastPromptNumber,
@@ -523,7 +523,7 @@ export class OpenRouterAgent {
      type: 'new_summary',
      summary: {
        id: summaryId,
-        session_id: session.claudeSessionId,
+        session_id: session.contentSessionId,
        request: summary.request,
        investigated: summary.investigated,
        learned: summary.learned,
 
@@ -74,7 +74,7 @@ export class PaginationHelper {
  getObservations(offset: number, limit: number, project?: string): PaginatedResult<Observation> {
    const result = this.paginate<Observation>(
      'observations',
-      'id, sdk_session_id, project, type, title, subtitle, narrative, text, facts, concepts, files_read, files_modified, prompt_number, created_at, created_at_epoch',
+      'id, memory_session_id, project, type, title, subtitle, narrative, text, facts, concepts, files_read, files_modified, prompt_number, created_at, created_at_epoch',
      offset,
      limit,
      project
@@ -96,7 +96,7 @@ export class PaginationHelper {
    let query = `
      SELECT
        ss.id,
-        s.claude_session_id as session_id,
+        s.content_session_id as session_id,
        ss.request,
        ss.investigated,
        ss.learned,
@@ -106,7 +106,7 @@ export class PaginationHelper {
        ss.created_at,
        ss.created_at_epoch
      FROM session_summaries ss
-      JOIN sdk_sessions s ON ss.sdk_session_id = s.sdk_session_id
+      JOIN sdk_sessions s ON ss.memory_session_id = s.memory_session_id
    `;
    const params: any[] = [];
 
@@ -136,9 +136,9 @@ export class PaginationHelper {
    const db = this.dbManager.getSessionStore().db;
 
    let query = `
-      SELECT up.id, up.claude_session_id, s.project, up.prompt_number, up.prompt_text, up.created_at, up.created_at_epoch
+      SELECT up.id, up.content_session_id, s.project, up.prompt_number, up.prompt_text, up.created_at, up.created_at_epoch
      FROM user_prompts up
-      JOIN sdk_sessions s ON up.claude_session_id = s.claude_session_id
+      JOIN sdk_sessions s ON up.content_session_id = s.content_session_id
    `;
    const params: any[] = [];
 
@@ -66,17 +66,20 @@ export class SDKAgent {
 
    logger.info('SDK', 'Starting SDK query', {
      sessionDbId: session.sessionDbId,
-      claudeSessionId: session.claudeSessionId,
-      resume_parameter: session.claudeSessionId,
+      contentSessionId: session.contentSessionId,
+      memorySessionId: session.memorySessionId,
+      resume_parameter: session.memorySessionId || '(none - fresh start)',
      lastPromptNumber: session.lastPromptNumber
    });
 
    // Run Agent SDK query loop
+    // Use memorySessionId for resume (captured from previous SDK response) if available
    const queryResult = query({
      prompt: messageGenerator,
      options: {
        model: modelId,
-        resume: session.claudeSessionId,
+        // Only resume if we have a captured memory session ID from previous SDK interaction
+        ...(session.memorySessionId && { resume: session.memorySessionId }),
        disallowedTools,
        abortController: session.abortController,
        pathToClaudeCodeExecutable: claudePath
@@ -85,6 +88,21 @@ export class SDKAgent {
 
    // Process SDK messages
    for await (const message of queryResult) {
+      // Capture memory session ID from first SDK message (any type has session_id)
+      // This enables resume for subsequent generator starts within the same user session
+      if (!session.memorySessionId && message.session_id) {
+        session.memorySessionId = message.session_id;
+        // Persist to database for cross-restart recovery
+        this.dbManager.getSessionStore().updateMemorySessionId(
+          session.sessionDbId,
+          message.session_id
+        );
+        logger.info('SDK', 'Captured memory session ID', {
+          sessionDbId: session.sessionDbId,
+          memorySessionId: message.session_id
+        });
+      }
+
      // Handle assistant messages
      if (message.type === 'assistant') {
        const content = message.message.content;
@@ -164,8 +182,8 @@ export class SDKAgent {
      }
      throw error;
    } finally {
-      // Cleanup
-      this.sessionManager.deleteSession(session.sessionDbId).catch(() => {});
+      // NOTE: Do NOT delete session here - SessionRoutes.finally() handles cleanup
+      // and auto-restart logic. Deleting here races with pending work checks.
    }
  }
 
@@ -184,7 +202,7 @@ export class SDKAgent {
   * - Continuation prompt for same session
   * - Includes session context and prompt number
   *
-   * BOTH prompts receive session.claudeSessionId:
+   * BOTH prompts receive session.contentSessionId:
   * - This comes from the hook's session_id (see new-hook.ts)
   * - Same session_id used by SAVE hook to store observations
   * - This is how everything stays connected in one unified session
@@ -207,28 +225,28 @@ export class SDKAgent {
    const isInitPrompt = session.lastPromptNumber === 1;
    logger.info('SDK', 'Creating message generator', {
      sessionDbId: session.sessionDbId,
-      claudeSessionId: session.claudeSessionId,
+      contentSessionId: session.contentSessionId,
      lastPromptNumber: session.lastPromptNumber,
      isInitPrompt,
      promptType: isInitPrompt ? 'INIT' : 'CONTINUATION'
    });
 
    const initPrompt = isInitPrompt
-      ? buildInitPrompt(session.project, session.claudeSessionId, session.userPrompt, mode)
-      : buildContinuationPrompt(session.userPrompt, session.lastPromptNumber, session.claudeSessionId, mode);
+      ? buildInitPrompt(session.project, session.contentSessionId, session.userPrompt, mode)
+      : buildContinuationPrompt(session.userPrompt, session.lastPromptNumber, session.contentSessionId, mode);
 
    // Add to shared conversation history for provider interop
    session.conversationHistory.push({ role: 'user', content: initPrompt });
 
    // Yield initial user prompt with context (or continuation if prompt #2+)
-    // CRITICAL: Both paths use session.claudeSessionId from the hook
+    // CRITICAL: Both paths use session.contentSessionId from the hook
    yield {
      type: 'user',
      message: {
        role: 'user',
        content: initPrompt
      },
-      session_id: session.claudeSessionId,
+      session_id: session.contentSessionId,
      parent_tool_use_id: null,
      isSynthetic: true
    };
@@ -259,14 +277,14 @@ export class SDKAgent {
          role: 'user',
          content: obsPrompt
        },
-        session_id: session.claudeSessionId,
+        session_id: session.contentSessionId,
        parent_tool_use_id: null,
        isSynthetic: true
      };
    } else if (message.type === 'summarize') {
      const summaryPrompt = buildSummaryPrompt({
        id: session.sessionDbId,
-        sdk_session_id: session.sdkSessionId,
+        memory_session_id: session.memorySessionId,
        project: session.project,
        user_prompt: session.userPrompt,
        last_user_message: message.last_user_message || '',
@@ -282,7 +300,7 @@ export class SDKAgent {
          role: 'user',
          content: summaryPrompt
        },
-        session_id: session.claudeSessionId,
+        session_id: session.contentSessionId,
        parent_tool_use_id: null,
        isSynthetic: true
      };
@@ -305,12 +323,12 @@ export class SDKAgent {
    }
 
    // Parse observations
-    const observations = parseObservations(text, session.claudeSessionId);
+    const observations = parseObservations(text, session.contentSessionId);
 
    // Store observations with original timestamp (if processing backlog) or current time
    for (const obs of observations) {
      const { id: obsId, createdAtEpoch } = this.dbManager.getSessionStore().storeObservation(
-        session.claudeSessionId,
+        session.contentSessionId,
        session.project,
        obs,
        session.lastPromptNumber,
@@ -335,7 +353,7 @@ export class SDKAgent {
      const obsTitle = obs.title || '(untitled)';
      this.dbManager.getChromaSync().syncObservation(
        obsId,
-        session.claudeSessionId,
+        session.contentSessionId,
        session.project,
        obs,
        session.lastPromptNumber,
@@ -363,8 +381,8 @@ export class SDKAgent {
        type: 'new_observation',
        observation: {
          id: obsId,
-          sdk_session_id: session.sdkSessionId,
-          session_id: session.claudeSessionId,
+          memory_session_id: session.memorySessionId,
+          session_id: session.contentSessionId,
          type: obs.type,
          title: obs.title,
          subtitle: obs.subtitle,
@@ -388,7 +406,7 @@ export class SDKAgent {
    // Store summary with original timestamp (if processing backlog) or current time
    if (summary) {
      const { id: summaryId, createdAtEpoch } = this.dbManager.getSessionStore().storeSummary(
-        session.claudeSessionId,
+        session.contentSessionId,
        session.project,
        summary,
        session.lastPromptNumber,
@@ -410,7 +428,7 @@ export class SDKAgent {
      const summaryRequest = summary.request || '(no request)';
      this.dbManager.getChromaSync().syncSummary(
        summaryId,
-        session.claudeSessionId,
+        session.contentSessionId,
        session.project,
        summary,
        session.lastPromptNumber,
@@ -436,7 +454,7 @@ export class SDKAgent {
      type: 'new_summary',
      summary: {
        id: summaryId,
-        session_id: session.claudeSessionId,
+        session_id: session.contentSessionId,
        request: summary.request,
        investigated: summary.investigated,
        learned: summary.learned,
 
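One detail in the SDKAgent hunk above worth calling out: `...(session.memorySessionId && { resume: session.memorySessionId })` adds the `resume` option only when a memory session ID has actually been captured, instead of always passing one. A minimal standalone illustration of the conditional-spread idiom (the `buildQueryOptions` name is hypothetical, not from the repo):

```typescript
// Conditional spread: spreading `null` (or any falsy guard) into an object
// literal is a no-op, so the key is simply omitted when the value is unset.
function buildQueryOptions(model: string, memorySessionId: string | null) {
  return {
    model,
    ...(memorySessionId && { resume: memorySessionId }),
  };
}

console.log(buildQueryOptions('sonnet', null));      // { model: 'sonnet' }
console.log(buildQueryOptions('sonnet', 'mem-123')); // { model: 'sonnet', resume: 'mem-123' }
```

Compared with `resume: memorySessionId ?? undefined`, this variant leaves the key out entirely, which matters for APIs that distinguish "option absent" from "option present but undefined".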
@@ -13,6 +13,7 @@ import { DatabaseManager } from './DatabaseManager.js';
 import { logger } from '../../utils/logger.js';
 import type { ActiveSession, PendingMessage, PendingMessageWithId, ObservationData } from '../worker-types.js';
 import { PendingMessageStore } from '../sqlite/PendingMessageStore.js';
+import { SessionQueueProcessor } from '../queue/SessionQueueProcessor.js';
 
 export class SessionManager {
  private dbManager: DatabaseManager;
@@ -58,7 +59,7 @@ export class SessionManager {
    if (session) {
      logger.info('SESSION', 'Returning cached session', {
        sessionDbId,
-        claudeSessionId: session.claudeSessionId,
+        contentSessionId: session.contentSessionId,
        lastPromptNumber: session.lastPromptNumber
      });
 
@@ -100,8 +101,8 @@ export class SessionManager {
 
    logger.info('SESSION', 'Fetched session from database', {
      sessionDbId,
-      claude_session_id: dbSession.claude_session_id,
-      sdk_session_id: dbSession.sdk_session_id
+      content_session_id: dbSession.content_session_id,
+      memory_session_id: dbSession.memory_session_id
    });
 
    // Use currentUserPrompt if provided, otherwise fall back to database (first prompt)
@@ -122,16 +123,17 @@ export class SessionManager {
    }
 
    // Create active session
+    // Load memorySessionId from database if previously captured (enables resume across restarts)
    session = {
      sessionDbId,
-      claudeSessionId: dbSession.claude_session_id,
-      sdkSessionId: null,
+      contentSessionId: dbSession.content_session_id,
+      memorySessionId: dbSession.memory_session_id || null,
      project: dbSession.project,
      userPrompt,
      pendingMessages: [],
      abortController: new AbortController(),
      generatorPromise: null,
-      lastPromptNumber: promptNumber || this.dbManager.getSessionStore().getPromptNumberFromUserPrompts(dbSession.claude_session_id),
+      lastPromptNumber: promptNumber || this.dbManager.getSessionStore().getPromptNumberFromUserPrompts(dbSession.content_session_id),
      startTime: Date.now(),
      cumulativeInputTokens: 0,
      cumulativeOutputTokens: 0,
@@ -143,8 +145,9 @@ export class SessionManager {
 
    logger.info('SESSION', 'Creating new session object', {
      sessionDbId,
-      claudeSessionId: dbSession.claude_session_id,
-      lastPromptNumber: promptNumber || this.dbManager.getSessionStore().getPromptNumberFromUserPrompts(dbSession.claude_session_id)
+      contentSessionId: dbSession.content_session_id,
+      memorySessionId: dbSession.memory_session_id || '(none - fresh session)',
+      lastPromptNumber: promptNumber || this.dbManager.getSessionStore().getPromptNumberFromUserPrompts(dbSession.content_session_id)
    });
 
    this.sessions.set(sessionDbId, session);
@@ -156,7 +159,7 @@ export class SessionManager {
    logger.info('SESSION', 'Session initialized', {
      sessionId: sessionDbId,
      project: session.project,
-      claudeSessionId: session.claudeSessionId,
+      contentSessionId: session.contentSessionId,
      queueDepth: 0,
      hasGenerator: false
    });
@@ -185,8 +188,6 @@ export class SessionManager {
      session = this.initializeSession(sessionDbId);
    }
 
-    const beforeDepth = session.pendingMessages.length;
-
    // CRITICAL: Persist to database FIRST
    const message: PendingMessage = {
      type: 'observation',
@@ -198,7 +199,7 @@ export class SessionManager {
    };
 
    try {
-      const messageId = this.getPendingStore().enqueue(sessionDbId, session.claudeSessionId, message);
+      const messageId = this.getPendingStore().enqueue(sessionDbId, session.contentSessionId, message);
      logger.debug('SESSION', `Observation persisted to DB`, {
        sessionId: sessionDbId,
        messageId,
@@ -212,11 +213,6 @@ export class SessionManager {
      throw error; // Don't continue if we can't persist
    }
 
-    // Add to in-memory queue (for backward compatibility with existing iterator)
-    session.pendingMessages.push(message);
-
-    const afterDepth = session.pendingMessages.length;
|
||||
// Notify generator immediately (zero latency)
|
||||
const emitter = this.sessionQueues.get(sessionDbId);
|
||||
emitter?.emit('message');
|
||||
@@ -224,7 +220,7 @@ export class SessionManager {
|
||||
// Format tool name for logging
|
||||
const toolSummary = logger.formatTool(data.tool_name, data.tool_input);
|
||||
|
||||
logger.info('SESSION', `Observation queued (${beforeDepth}→${afterDepth})`, {
|
||||
logger.info('SESSION', `Observation queued`, {
|
||||
sessionId: sessionDbId,
|
||||
tool: toolSummary,
|
||||
hasGenerator: !!session.generatorPromise
|
||||
@@ -245,8 +241,6 @@ export class SessionManager {
|
||||
session = this.initializeSession(sessionDbId);
|
||||
}
|
||||
|
||||
const beforeDepth = session.pendingMessages.length;
|
||||
|
||||
// CRITICAL: Persist to database FIRST
|
||||
const message: PendingMessage = {
|
||||
type: 'summarize',
|
||||
@@ -255,7 +249,7 @@ export class SessionManager {
|
||||
};
|
||||
|
||||
try {
|
||||
const messageId = this.getPendingStore().enqueue(sessionDbId, session.claudeSessionId, message);
|
||||
const messageId = this.getPendingStore().enqueue(sessionDbId, session.contentSessionId, message);
|
||||
logger.debug('SESSION', `Summarize persisted to DB`, {
|
||||
sessionId: sessionDbId,
|
||||
messageId
|
||||
@@ -267,15 +261,10 @@ export class SessionManager {
|
||||
throw error; // Don't continue if we can't persist
|
||||
}
|
||||
|
||||
// Add to in-memory queue (for backward compatibility with existing iterator)
|
||||
session.pendingMessages.push(message);
|
||||
|
||||
const afterDepth = session.pendingMessages.length;
|
||||
|
||||
const emitter = this.sessionQueues.get(sessionDbId);
|
||||
emitter?.emit('message');
|
||||
|
||||
logger.info('SESSION', `Summarize queued (${beforeDepth}→${afterDepth})`, {
|
||||
logger.info('SESSION', `Summarize queued`, {
|
||||
sessionId: sessionDbId,
|
||||
hasGenerator: !!session.generatorPromise
|
||||
});
|
||||
@@ -328,9 +317,7 @@ export class SessionManager {
|
||||
* Check if any session has pending messages (for spinner tracking)
|
||||
*/
|
||||
hasPendingMessages(): boolean {
|
||||
return Array.from(this.sessions.values()).some(
|
||||
session => session.pendingMessages.length > 0
|
||||
);
|
||||
return this.getPendingStore().hasAnyPendingWork();
|
||||
}
|
||||
|
||||
/**
|
||||
@@ -345,8 +332,9 @@ export class SessionManager {
|
||||
*/
|
||||
getTotalQueueDepth(): number {
|
||||
let total = 0;
|
||||
// We can iterate over active sessions to get their pending count
|
||||
for (const session of this.sessions.values()) {
|
||||
total += session.pendingMessages.length;
|
||||
total += this.getPendingStore().getPendingCount(session.sessionDbId);
|
||||
}
|
||||
return total;
|
||||
}
|
||||
@@ -356,16 +344,8 @@ export class SessionManager {
|
||||
* Counts both pending messages and items actively being processed by SDK agents
|
||||
*/
|
||||
getTotalActiveWork(): number {
|
||||
let total = 0;
|
||||
for (const session of this.sessions.values()) {
|
||||
// Count queued messages
|
||||
total += session.pendingMessages.length;
|
||||
// Count currently processing item (1 per active generator)
|
||||
if (session.generatorPromise !== null) {
|
||||
total += 1;
|
||||
}
|
||||
}
|
||||
return total;
|
||||
// getPendingCount includes 'processing' status, so this IS the total active work
|
||||
return this.getTotalQueueDepth();
|
||||
}
|
||||
|
||||
/**
|
||||
@@ -373,17 +353,8 @@ export class SessionManager {
|
||||
* Used for activity indicator to prevent spinner from stopping while SDK is processing
|
||||
*/
|
||||
isAnySessionProcessing(): boolean {
|
||||
for (const session of this.sessions.values()) {
|
||||
// Has queued messages waiting to be processed
|
||||
if (session.pendingMessages.length > 0) {
|
||||
return true;
|
||||
}
|
||||
// Has active SDK generator running (processing dequeued messages)
|
||||
if (session.generatorPromise !== null) {
|
||||
return true;
|
||||
}
|
||||
}
|
||||
return false;
|
||||
// hasAnyPendingWork checks for 'pending' OR 'processing'
|
||||
return this.getPendingStore().hasAnyPendingWork();
|
||||
}
|
||||
|
||||
/**
|
||||
@@ -406,77 +377,22 @@ export class SessionManager {
|
||||
throw new Error(`No emitter for session ${sessionDbId}`);
|
||||
}
|
||||
|
||||
while (!session.abortController.signal.aborted) {
|
||||
// Check for pending messages in persistent store
|
||||
const persistentMessage = this.getPendingStore().peekPending(sessionDbId);
|
||||
|
||||
if (!persistentMessage) {
|
||||
// Wait for new message event
|
||||
await new Promise<void>(resolve => {
|
||||
const messageHandler = () => {
|
||||
emitter.off('message', messageHandler);
|
||||
resolve();
|
||||
};
|
||||
|
||||
const abortHandler = () => {
|
||||
emitter.off('message', messageHandler);
|
||||
resolve();
|
||||
};
|
||||
|
||||
emitter.once('message', messageHandler);
|
||||
session.abortController.signal.addEventListener('abort', abortHandler, { once: true });
|
||||
});
|
||||
|
||||
// Re-check for messages after waking up (handles race condition)
|
||||
const recheckMessage = this.getPendingStore().peekPending(sessionDbId);
|
||||
if (recheckMessage) {
|
||||
continue; // Got a message, process it
|
||||
}
|
||||
|
||||
// Woke up due to abort
|
||||
if (session.abortController.signal.aborted) {
|
||||
logger.info('SESSION', 'Generator exiting due to abort', { sessionId: sessionDbId });
|
||||
return;
|
||||
}
|
||||
|
||||
continue;
|
||||
}
|
||||
|
||||
// Mark as processing BEFORE yielding (status: pending -> processing)
|
||||
this.getPendingStore().markProcessing(persistentMessage.id);
|
||||
|
||||
const processor = new SessionQueueProcessor(this.getPendingStore(), emitter);
|
||||
|
||||
// Use the robust Pump iterator
|
||||
for await (const message of processor.createIterator(sessionDbId, session.abortController.signal)) {
|
||||
// Track this message ID for completion marking
|
||||
session.pendingProcessingIds.add(persistentMessage.id);
|
||||
session.pendingProcessingIds.add(message._persistentId);
|
||||
|
||||
// Track earliest timestamp for accurate observation timestamps
|
||||
// This ensures backlog messages get their original timestamps, not current time
|
||||
if (session.earliestPendingTimestamp === null) {
|
||||
session.earliestPendingTimestamp = persistentMessage.created_at_epoch;
|
||||
session.earliestPendingTimestamp = message._originalTimestamp;
|
||||
} else {
|
||||
session.earliestPendingTimestamp = Math.min(session.earliestPendingTimestamp, persistentMessage.created_at_epoch);
|
||||
session.earliestPendingTimestamp = Math.min(session.earliestPendingTimestamp, message._originalTimestamp);
|
||||
}
|
||||
|
||||
// Convert to PendingMessageWithId and yield
|
||||
// Include original timestamp for accurate observation timestamps (survives stuck processing)
|
||||
const message: PendingMessageWithId = {
|
||||
_persistentId: persistentMessage.id,
|
||||
_originalTimestamp: persistentMessage.created_at_epoch,
|
||||
...this.getPendingStore().toPendingMessage(persistentMessage)
|
||||
};
|
||||
|
||||
// Also add to in-memory queue for backward compatibility (status tracking)
|
||||
session.pendingMessages.push(message);
|
||||
|
||||
yield message;
|
||||
|
||||
// Remove from in-memory queue after yielding
|
||||
session.pendingMessages.shift();
|
||||
|
||||
// If we just yielded a summary, that's the end of this batch - stop the iterator
|
||||
if (message.type === 'summarize') {
|
||||
logger.info('SESSION', `Summary yielded - ending generator`, { sessionId: sessionDbId });
|
||||
return;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
@@ -21,7 +21,7 @@ export class SessionEventBroadcaster {
   */
  broadcastNewPrompt(prompt: {
    id: number;
-   claude_session_id: string;
+   content_session_id: string;
    project: string;
    prompt_number: number;
    prompt_text: string;
@@ -74,9 +74,12 @@ export abstract class BaseRouteHandler {

   /**
    * Centralized error logging and response
+   * Checks headersSent to avoid "Cannot set headers after they are sent" errors
    */
   protected handleError(res: Response, error: Error, context?: string): void {
     logger.failure('WORKER', context || 'Request failed', {}, error);
-    res.status(500).json({ error: error.message });
+    if (!res.headersSent) {
+      res.status(500).json({ error: error.message });
+    }
   }
 }
@@ -158,18 +158,18 @@ export class DataRoutes extends BaseRouteHandler {
   /**
    * Get SDK sessions by SDK session IDs
    * POST /api/sdk-sessions/batch
-   * Body: { sdkSessionIds: string[] }
+   * Body: { memorySessionIds: string[] }
    */
   private handleGetSdkSessionsByIds = this.wrapHandler((req: Request, res: Response): void => {
-    const { sdkSessionIds } = req.body;
+    const { memorySessionIds } = req.body;

-    if (!Array.isArray(sdkSessionIds)) {
-      this.badRequest(res, 'sdkSessionIds must be an array');
+    if (!Array.isArray(memorySessionIds)) {
+      this.badRequest(res, 'memorySessionIds must be an array');
       return;
     }

     const store = this.dbManager.getSessionStore();
-    const sessions = store.getSdkSessionsBySessionIds(sdkSessionIds);
+    const sessions = store.getSdkSessionsBySessionIds(memorySessionIds);
     res.json(sessions);
   });
@@ -45,7 +45,7 @@ export class SessionRoutes extends BaseRouteHandler {
   * Get the appropriate agent based on settings
   * Throws error if provider is selected but not configured (no silent fallback)
   *
-  * Note: Session linking via claudeSessionId allows provider switching mid-session.
+  * Note: Session linking via contentSessionId allows provider switching mid-session.
   * The conversationHistory on ActiveSession maintains context across providers.
   */
  private getActiveAgent(): SDKAgent | GeminiAgent | OpenRouterAgent {
@@ -136,17 +136,75 @@ export class SessionRoutes extends BaseRouteHandler {

     session.generatorPromise = agent.startSession(session, this.workerService)
       .catch(error => {
         // Only log non-abort errors
         if (session.abortController.signal.aborted) return;

         logger.error('SESSION', `Generator failed`, {
           sessionId: session.sessionDbId,
           provider: provider,
           error: error.message
         }, error);
+
+        // Mark all processing messages as failed so they can be retried or abandoned
+        const pendingStore = this.sessionManager.getPendingMessageStore();
+        const db = this.dbManager.getSessionStore().db;
+        try {
+          const stmt = db.prepare(`
+            SELECT id FROM pending_messages
+            WHERE session_db_id = ? AND status = 'processing'
+          `);
+          const processingMessages = stmt.all(session.sessionDbId) as { id: number }[];
+
+          for (const msg of processingMessages) {
+            pendingStore.markFailed(msg.id);
+            logger.warn('SESSION', `Marked message as failed after generator error`, {
+              sessionId: session.sessionDbId,
+              messageId: msg.id
+            });
+          }
+        } catch (dbError) {
+          logger.error('SESSION', 'Failed to mark messages as failed', { sessionId: session.sessionDbId }, dbError as Error);
+        }
       })
       .finally(() => {
-        logger.info('SESSION', `Generator finished`, { sessionId: session.sessionDbId });
+        const sessionDbId = session.sessionDbId;
+
+        if (session.abortController.signal.aborted) {
+          logger.info('SESSION', `Generator aborted`, { sessionId: sessionDbId });
+        } else {
+          logger.warn('SESSION', `Generator exited unexpectedly`, { sessionId: sessionDbId });
+        }
+
         session.generatorPromise = null;
         session.currentProvider = null;
         this.workerService.broadcastProcessingStatus();
+
+        // Crash recovery: If not aborted and still has work, restart
+        if (!session.abortController.signal.aborted) {
+          try {
+            const pendingStore = this.sessionManager.getPendingMessageStore();
+            const pendingCount = pendingStore.getPendingCount(sessionDbId);
+
+            if (pendingCount > 0) {
+              logger.info('SESSION', `Restarting generator after crash/exit with pending work`, {
+                sessionId: sessionDbId,
+                pendingCount
+              });
+              // Small delay before restart
+              setTimeout(() => {
+                const stillExists = this.sessionManager.getSession(sessionDbId);
+                if (stillExists && !stillExists.generatorPromise) {
+                  this.startGeneratorWithProvider(stillExists, this.getSelectedProvider(), 'crash-recovery');
+                }
+              }, 1000);
+            }
+          } catch (e) {
+            // Ignore errors during recovery check
+          }
+        }
+        // NOTE: We do NOT delete the session here anymore.
+        // The generator waits for events, so if it exited, it's either aborted or crashed.
+        // Idle sessions stay in memory (ActiveSession is small) to listen for future events.
       });
   }
@@ -159,7 +217,7 @@ export class SessionRoutes extends BaseRouteHandler {
     app.delete('/sessions/:sessionDbId', this.handleSessionDelete.bind(this));
     app.post('/sessions/:sessionDbId/complete', this.handleSessionComplete.bind(this));

-    // New session endpoints (use claudeSessionId)
+    // New session endpoints (use contentSessionId)
     app.post('/api/sessions/init', this.handleSessionInitByClaudeId.bind(this));
     app.post('/api/sessions/observations', this.handleObservationsByClaudeId.bind(this));
     app.post('/api/sessions/summarize', this.handleSummarizeByClaudeId.bind(this));
@@ -182,13 +240,13 @@ export class SessionRoutes extends BaseRouteHandler {
     const session = this.sessionManager.initializeSession(sessionDbId, userPrompt, promptNumber);

     // Get the latest user_prompt for this session to sync to Chroma
-    const latestPrompt = this.dbManager.getSessionStore().getLatestUserPrompt(session.claudeSessionId);
+    const latestPrompt = this.dbManager.getSessionStore().getLatestUserPrompt(session.contentSessionId);

     // Broadcast new prompt to SSE clients (for web UI)
     if (latestPrompt) {
       this.eventBroadcaster.broadcastNewPrompt({
         id: latestPrompt.id,
-        claude_session_id: latestPrompt.claude_session_id,
+        content_session_id: latestPrompt.content_session_id,
         project: latestPrompt.project,
         prompt_number: latestPrompt.prompt_number,
         prompt_text: latestPrompt.prompt_text,
@@ -200,7 +258,7 @@ export class SessionRoutes extends BaseRouteHandler {
       const promptText = latestPrompt.prompt_text;
       this.dbManager.getChromaSync().syncUserPrompt(
         latestPrompt.id,
-        latestPrompt.sdk_session_id,
+        latestPrompt.memory_session_id,
         latestPrompt.project,
         promptText,
         latestPrompt.prompt_number,
@@ -329,15 +387,15 @@ export class SessionRoutes extends BaseRouteHandler {
   });

   /**
-   * Queue observations by claudeSessionId (post-tool-use-hook uses this)
+   * Queue observations by contentSessionId (post-tool-use-hook uses this)
    * POST /api/sessions/observations
-   * Body: { claudeSessionId, tool_name, tool_input, tool_response, cwd }
+   * Body: { contentSessionId, tool_name, tool_input, tool_response, cwd }
    */
   private handleObservationsByClaudeId = this.wrapHandler((req: Request, res: Response): void => {
-    const { claudeSessionId, tool_name, tool_input, tool_response, cwd } = req.body;
+    const { contentSessionId, tool_name, tool_input, tool_response, cwd } = req.body;

-    if (!claudeSessionId) {
-      return this.badRequest(res, 'Missing claudeSessionId');
+    if (!contentSessionId) {
+      return this.badRequest(res, 'Missing contentSessionId');
     }

     // Load skip tools from settings
@@ -368,13 +426,13 @@ export class SessionRoutes extends BaseRouteHandler {
     const store = this.dbManager.getSessionStore();

     // Get or create session
-    const sessionDbId = store.createSDKSession(claudeSessionId, '', '');
-    const promptNumber = store.getPromptNumberFromUserPrompts(claudeSessionId);
+    const sessionDbId = store.createSDKSession(contentSessionId, '', '');
+    const promptNumber = store.getPromptNumberFromUserPrompts(contentSessionId);

     // Privacy check: skip if user prompt was entirely private
     const userPrompt = PrivacyCheckValidator.checkUserPromptPrivacy(
       store,
-      claudeSessionId,
+      contentSessionId,
       promptNumber,
       'observation',
       sessionDbId,
@@ -419,29 +477,29 @@ export class SessionRoutes extends BaseRouteHandler {
   });

   /**
-   * Queue summarize by claudeSessionId (summary-hook uses this)
+   * Queue summarize by contentSessionId (summary-hook uses this)
    * POST /api/sessions/summarize
-   * Body: { claudeSessionId, last_user_message, last_assistant_message }
+   * Body: { contentSessionId, last_user_message, last_assistant_message }
    *
    * Checks privacy, queues summarize request for SDK agent
    */
   private handleSummarizeByClaudeId = this.wrapHandler((req: Request, res: Response): void => {
-    const { claudeSessionId, last_user_message, last_assistant_message } = req.body;
+    const { contentSessionId, last_user_message, last_assistant_message } = req.body;

-    if (!claudeSessionId) {
-      return this.badRequest(res, 'Missing claudeSessionId');
+    if (!contentSessionId) {
+      return this.badRequest(res, 'Missing contentSessionId');
     }

     const store = this.dbManager.getSessionStore();

     // Get or create session
-    const sessionDbId = store.createSDKSession(claudeSessionId, '', '');
-    const promptNumber = store.getPromptNumberFromUserPrompts(claudeSessionId);
+    const sessionDbId = store.createSDKSession(contentSessionId, '', '');
+    const promptNumber = store.getPromptNumberFromUserPrompts(contentSessionId);

     // Privacy check: skip if user prompt was entirely private
     const userPrompt = PrivacyCheckValidator.checkUserPromptPrivacy(
       store,
-      claudeSessionId,
+      contentSessionId,
       promptNumber,
       'summarize',
       sessionDbId
@@ -474,9 +532,9 @@ export class SessionRoutes extends BaseRouteHandler {
   });

   /**
-   * Initialize session by claudeSessionId (new-hook uses this)
+   * Initialize session by contentSessionId (new-hook uses this)
    * POST /api/sessions/init
-   * Body: { claudeSessionId, project, prompt }
+   * Body: { contentSessionId, project, prompt }
    *
    * Performs all session initialization DB operations:
    * - Creates/gets SDK session (idempotent)
@@ -486,31 +544,31 @@ export class SessionRoutes extends BaseRouteHandler {
    * Returns: { sessionDbId, promptNumber, skipped: boolean, reason?: string }
    */
   private handleSessionInitByClaudeId = this.wrapHandler((req: Request, res: Response): void => {
-    const { claudeSessionId, project, prompt } = req.body;
+    const { contentSessionId, project, prompt } = req.body;

     logger.info('HTTP', 'SessionRoutes: handleSessionInitByClaudeId called', {
-      claudeSessionId,
+      contentSessionId,
       project,
       prompt_length: prompt?.length
     });

     // Validate required parameters
-    if (!this.validateRequired(req, res, ['claudeSessionId', 'project', 'prompt'])) {
+    if (!this.validateRequired(req, res, ['contentSessionId', 'project', 'prompt'])) {
       return;
     }

     const store = this.dbManager.getSessionStore();

     // Step 1: Create/get SDK session (idempotent INSERT OR IGNORE)
-    const sessionDbId = store.createSDKSession(claudeSessionId, project, prompt);
+    const sessionDbId = store.createSDKSession(contentSessionId, project, prompt);

     logger.info('HTTP', 'SessionRoutes: createSDKSession returned', {
       sessionDbId,
-      claudeSessionId
+      contentSessionId
     });

     // Step 2: Get next prompt number from user_prompts count
-    const currentCount = store.getPromptNumberFromUserPrompts(claudeSessionId);
+    const currentCount = store.getPromptNumberFromUserPrompts(contentSessionId);
     const promptNumber = currentCount + 1;

     logger.info('HTTP', 'SessionRoutes: Calculated promptNumber', {
@@ -540,7 +598,7 @@ export class SessionRoutes extends BaseRouteHandler {
     }

     // Step 5: Save cleaned user prompt
-    store.saveUserPrompt(claudeSessionId, promptNumber, cleanedPrompt);
+    store.saveUserPrompt(contentSessionId, promptNumber, cleanedPrompt);

     logger.info('SESSION', 'Session initialized via HTTP', {
       sessionId: sessionDbId,
@@ -12,20 +12,20 @@ export class PrivacyCheckValidator {
   * Check if user prompt is public (not entirely private)
   *
   * @param store - SessionStore instance
-  * @param claudeSessionId - Claude session ID
+  * @param contentSessionId - Claude session ID
   * @param promptNumber - Prompt number within session
   * @param operationType - Type of operation being validated ('observation' or 'summarize')
   * @returns User prompt text if public, null if private
   */
  static checkUserPromptPrivacy(
    store: SessionStore,
-   claudeSessionId: string,
+   contentSessionId: string,
    promptNumber: number,
    operationType: 'observation' | 'summarize',
    sessionDbId: number,
    additionalContext?: Record<string, any>
  ): string | null {
-   const userPrompt = store.getUserPrompt(claudeSessionId, promptNumber);
+   const userPrompt = store.getUserPrompt(contentSessionId, promptNumber);

    if (!userPrompt || userPrompt.trim() === '') {
      logger.debug('HOOK', `Skipping ${operationType} - user prompt was entirely private`, {
@@ -45,8 +45,8 @@ export interface SchemaVersion {
  */
 export interface SdkSessionRecord {
   id: number;
-  claude_session_id: string;
-  sdk_session_id: string | null;
+  content_session_id: string;
+  memory_session_id: string | null;
   project: string;
   user_prompt: string | null;
   started_at: string;
@@ -63,7 +63,7 @@ export interface SdkSessionRecord {
  */
 export interface ObservationRecord {
   id: number;
-  sdk_session_id: string;
+  memory_session_id: string;
   project: string;
   text: string | null;
   type: 'decision' | 'bugfix' | 'feature' | 'refactor' | 'discovery' | 'change';
@@ -81,7 +81,7 @@ export interface ObservationRecord {
  */
 export interface SessionSummaryRecord {
   id: number;
-  sdk_session_id: string;
+  memory_session_id: string;
   project: string;
   request: string | null;
   investigated: string | null;
@@ -99,9 +99,10 @@ export interface SessionSummaryRecord {
  */
 export interface UserPromptRecord {
   id: number;
-  claude_session_id: string;
+  content_session_id: string;
   prompt_number: number;
   prompt_text: string;
+  project?: string; // From JOIN with sdk_sessions
   created_at: string;
   created_at_epoch: number;
 }
@@ -111,8 +112,8 @@ export interface UserPromptRecord {
  */
 export interface LatestPromptResult {
   id: number;
-  claude_session_id: string;
-  sdk_session_id: string;
+  content_session_id: string;
+  memory_session_id: string;
   project: string;
   prompt_number: number;
   prompt_text: string;
@@ -124,7 +125,7 @@ export interface LatestPromptResult {
  */
 export interface ObservationWithContext {
   id: number;
-  sdk_session_id: string;
+  memory_session_id: string;
   project: string;
   text: string | null;
   type: string;

@@ -1,6 +1,6 @@
 export interface Observation {
   id: number;
-  sdk_session_id: string;
+  memory_session_id: string;
   project: string;
   type: string;
   title: string | null;
@@ -30,7 +30,7 @@ export interface Summary {

 export interface UserPrompt {
   id: number;
-  claude_session_id: string;
+  content_session_id: string;
   project: string;
   prompt_number: number;
   prompt_text: string;
@@ -19,7 +19,7 @@ export type Component = 'HOOK' | 'WORKER' | 'SDK' | 'PARSER' | 'DB' | 'SYSTEM' |

 interface LogContext {
   sessionId?: number;
-  sdkSessionId?: string;
+  memorySessionId?: string;
   correlationId?: string;
   [key: string]: any;
 }
@@ -253,7 +253,7 @@ class Logger {
     // Build additional context
     let contextStr = '';
     if (context) {
-      const { sessionId, sdkSessionId, correlationId, ...rest } = context;
+      const { sessionId, memorySessionId, correlationId, ...rest } = context;
       if (Object.keys(rest).length > 0) {
         const pairs = Object.entries(rest).map(([k, v]) => `${k}=${v}`);
         contextStr = ` {${pairs.join(', ')}}`;
@@ -0,0 +1,405 @@
+import { describe, it, expect, beforeEach, afterEach } from 'bun:test';
+import { SessionStore } from '../src/services/sqlite/SessionStore.js';
+
+/**
+ * Tests for Session ID Refactoring
+ *
+ * Validates the semantic renaming:
+ * - claudeSessionId → contentSessionId (user's observed Claude Code session)
+ * - sdkSessionId → memorySessionId (memory agent's session ID for resume)
+ *
+ * Also validates the memory session ID capture mechanism for resume functionality.
+ */
+describe('Session ID Refactor', () => {
+  let store: SessionStore;
+
+  beforeEach(() => {
+    store = new SessionStore(':memory:');
+  });
+
+  afterEach(() => {
+    store.close();
+  });
+
+  describe('Database Migration 17 - Column Renaming', () => {
+    it('should have content_session_id column in sdk_sessions table', () => {
+      const tableInfo = store.db.query('PRAGMA table_info(sdk_sessions)').all() as Array<{ name: string }>;
+      const columnNames = tableInfo.map(col => col.name);
+
+      expect(columnNames).toContain('content_session_id');
+      expect(columnNames).not.toContain('claude_session_id');
+    });
+
+    it('should have memory_session_id column in sdk_sessions table', () => {
+      const tableInfo = store.db.query('PRAGMA table_info(sdk_sessions)').all() as Array<{ name: string }>;
+      const columnNames = tableInfo.map(col => col.name);
+
+      expect(columnNames).toContain('memory_session_id');
+      expect(columnNames).not.toContain('sdk_session_id');
+    });
+
+    it('should have memory_session_id column in observations table', () => {
+      const tableInfo = store.db.query('PRAGMA table_info(observations)').all() as Array<{ name: string }>;
+      const columnNames = tableInfo.map(col => col.name);
+
+      expect(columnNames).toContain('memory_session_id');
+      expect(columnNames).not.toContain('sdk_session_id');
+    });
+
+    it('should have memory_session_id column in session_summaries table', () => {
+      const tableInfo = store.db.query('PRAGMA table_info(session_summaries)').all() as Array<{ name: string }>;
+      const columnNames = tableInfo.map(col => col.name);
+
+      expect(columnNames).toContain('memory_session_id');
+      expect(columnNames).not.toContain('sdk_session_id');
+    });
+
+    it('should have content_session_id column in user_prompts table', () => {
+      const tableInfo = store.db.query('PRAGMA table_info(user_prompts)').all() as Array<{ name: string }>;
+      const columnNames = tableInfo.map(col => col.name);
+
+      expect(columnNames).toContain('content_session_id');
+      expect(columnNames).not.toContain('claude_session_id');
+    });
+
+    it('should have content_session_id column in pending_messages table', () => {
+      const tableInfo = store.db.query('PRAGMA table_info(pending_messages)').all() as Array<{ name: string }>;
+      const columnNames = tableInfo.map(col => col.name);
+
+      expect(columnNames).toContain('content_session_id');
+      expect(columnNames).not.toContain('claude_session_id');
+    });
+
+    it('should record migration 17 in schema_versions', () => {
+      const result = store.db.prepare(
+        'SELECT version FROM schema_versions WHERE version = 17'
+      ).get() as { version: number } | undefined;
+
+      expect(result).toBeDefined();
+      expect(result?.version).toBe(17);
+    });
+  });
+
+  describe('createSDKSession - Session ID Initialization', () => {
+    it('should create session with content_session_id set to the provided session ID', () => {
+      const contentSessionId = 'user-claude-code-session-123';
+      const sessionDbId = store.createSDKSession(contentSessionId, 'test-project', 'Test prompt');
+
+      const session = store.db.prepare(
+        'SELECT content_session_id FROM sdk_sessions WHERE id = ?'
+      ).get(sessionDbId) as { content_session_id: string };
+
+      expect(session.content_session_id).toBe(contentSessionId);
+    });
+
+    it('should create session with memory_session_id initially equal to content_session_id', () => {
+      const contentSessionId = 'user-session-456';
+      const sessionDbId = store.createSDKSession(contentSessionId, 'test-project', 'Test prompt');
+
+      const session = store.db.prepare(
+        'SELECT content_session_id, memory_session_id FROM sdk_sessions WHERE id = ?'
+      ).get(sessionDbId) as { content_session_id: string; memory_session_id: string };
+
+      // Initially they're the same - memory_session_id gets updated when SDK responds
+      expect(session.memory_session_id).toBe(contentSessionId);
+    });
+
+    it('should be idempotent - return same ID for same content_session_id', () => {
+      const contentSessionId = 'idempotent-test-session';
+
+      const id1 = store.createSDKSession(contentSessionId, 'project-1', 'First prompt');
+      const id2 = store.createSDKSession(contentSessionId, 'project-2', 'Second prompt');
+
+      expect(id1).toBe(id2);
+
+      // Verify the original values are preserved (INSERT OR IGNORE)
+      const session = store.db.prepare(
+        'SELECT project, user_prompt FROM sdk_sessions WHERE id = ?'
+      ).get(id1) as { project: string; user_prompt: string };
+
+      expect(session.project).toBe('project-1');
+      expect(session.user_prompt).toBe('First prompt');
+    });
+  });
+
+  describe('updateMemorySessionId - Memory Agent Session Capture', () => {
+    it('should update memory_session_id for existing session', () => {
+      const contentSessionId = 'content-session-789';
+      const memorySessionId = 'sdk-generated-memory-session-abc';
+
+      const sessionDbId = store.createSDKSession(contentSessionId, 'test-project', 'Test');
+
+      // Initially memory_session_id equals content_session_id
+      const beforeUpdate = store.db.prepare(
+        'SELECT memory_session_id FROM sdk_sessions WHERE id = ?'
+      ).get(sessionDbId) as { memory_session_id: string };
+      expect(beforeUpdate.memory_session_id).toBe(contentSessionId);
+
+      // Update with SDK-captured memory session ID
+      store.updateMemorySessionId(sessionDbId, memorySessionId);
+
+      // Verify it was updated
+      const afterUpdate = store.db.prepare(
|
||||
'SELECT memory_session_id FROM sdk_sessions WHERE id = ?'
|
||||
).get(sessionDbId) as { memory_session_id: string };
|
||||
expect(afterUpdate.memory_session_id).toBe(memorySessionId);
|
||||
});
|
||||
|
||||
it('should allow updating memory_session_id multiple times', () => {
|
||||
const contentSessionId = 'multi-update-session';
|
||||
const sessionDbId = store.createSDKSession(contentSessionId, 'test-project', 'Test');
|
||||
|
||||
store.updateMemorySessionId(sessionDbId, 'first-memory-id');
|
||||
store.updateMemorySessionId(sessionDbId, 'second-memory-id');
|
||||
|
||||
const session = store.db.prepare(
|
||||
'SELECT memory_session_id FROM sdk_sessions WHERE id = ?'
|
||||
).get(sessionDbId) as { memory_session_id: string };
|
||||
|
||||
expect(session.memory_session_id).toBe('second-memory-id');
|
||||
});
|
||||
});
|
||||
|
||||
describe('getSessionById - Session Retrieval', () => {
|
||||
it('should return session with both content_session_id and memory_session_id', () => {
|
||||
const contentSessionId = 'retrieve-test-session';
|
||||
const memorySessionId = 'captured-memory-id';
|
||||
|
||||
const sessionDbId = store.createSDKSession(contentSessionId, 'test-project', 'Test prompt');
|
||||
store.updateMemorySessionId(sessionDbId, memorySessionId);
|
||||
|
||||
const session = store.getSessionById(sessionDbId);
|
||||
|
||||
expect(session).not.toBeNull();
|
||||
expect(session?.content_session_id).toBe(contentSessionId);
|
||||
expect(session?.memory_session_id).toBe(memorySessionId);
|
||||
});
|
||||
|
||||
it('should initialize memory_session_id to content_session_id before SDK capture', () => {
|
||||
const contentSessionId = 'never-captured-session';
|
||||
const sessionDbId = store.createSDKSession(contentSessionId, 'test-project', 'Test');
|
||||
|
||||
// createSDKSession sets memory_session_id = content_session_id initially
|
||||
// The memory_session_id gets updated when SDK responds with its session ID
|
||||
const session = store.getSessionById(sessionDbId);
|
||||
expect(session?.memory_session_id).toBe(contentSessionId);
|
||||
});
|
||||
});
|
||||
|
||||
describe('storeObservation - Memory Session ID Reference', () => {
|
||||
it('should store observation with memory_session_id as foreign key', () => {
|
||||
const contentSessionId = 'obs-test-session';
|
||||
store.createSDKSession(contentSessionId, 'test-project', 'Test');
|
||||
|
||||
const obs = {
|
||||
type: 'discovery',
|
||||
title: 'Test Observation',
|
||||
subtitle: null,
|
||||
facts: ['Fact 1'],
|
||||
narrative: 'Testing memory session ID reference',
|
||||
concepts: ['testing'],
|
||||
files_read: [],
|
||||
files_modified: []
|
||||
};
|
||||
|
||||
const result = store.storeObservation(contentSessionId, 'test-project', obs, 1);
|
||||
|
||||
// Verify the observation was stored with memory_session_id
|
||||
const stored = store.db.prepare(
|
||||
'SELECT memory_session_id FROM observations WHERE id = ?'
|
||||
).get(result.id) as { memory_session_id: string };
|
||||
|
||||
expect(stored.memory_session_id).toBe(contentSessionId);
|
||||
});
|
||||
|
||||
it('should be retrievable by getObservationsForSession using memory_session_id', () => {
|
||||
const contentSessionId = 'obs-retrieval-session';
|
||||
store.createSDKSession(contentSessionId, 'test-project', 'Test');
|
||||
|
||||
const obs = {
|
||||
type: 'feature',
|
||||
title: 'New Feature',
|
||||
subtitle: 'Sub',
|
||||
facts: [],
|
||||
narrative: null,
|
||||
concepts: [],
|
||||
files_read: ['file1.ts'],
|
||||
files_modified: ['file2.ts']
|
||||
};
|
||||
|
||||
store.storeObservation(contentSessionId, 'test-project', obs, 1);
|
||||
|
||||
const observations = store.getObservationsForSession(contentSessionId);
|
||||
|
||||
expect(observations.length).toBe(1);
|
||||
expect(observations[0].title).toBe('New Feature');
|
||||
});
|
||||
});
|
||||
|
||||
describe('storeSummary - Memory Session ID Reference', () => {
|
||||
it('should store summary with memory_session_id as foreign key', () => {
|
||||
const contentSessionId = 'summary-test-session';
|
||||
store.createSDKSession(contentSessionId, 'test-project', 'Test');
|
||||
|
||||
const summary = {
|
||||
request: 'Test request',
|
||||
investigated: 'Investigated stuff',
|
||||
learned: 'Learned things',
|
||||
completed: 'Completed work',
|
||||
next_steps: 'Next steps here',
|
||||
notes: null
|
||||
};
|
||||
|
||||
const result = store.storeSummary(contentSessionId, 'test-project', summary, 1);
|
||||
|
||||
// Verify the summary was stored with memory_session_id
|
||||
const stored = store.db.prepare(
|
||||
'SELECT memory_session_id FROM session_summaries WHERE id = ?'
|
||||
).get(result.id) as { memory_session_id: string };
|
||||
|
||||
expect(stored.memory_session_id).toBe(contentSessionId);
|
||||
});
|
||||
|
||||
it('should be retrievable by getSummaryForSession using memory_session_id', () => {
|
||||
const contentSessionId = 'summary-retrieval-session';
|
||||
store.createSDKSession(contentSessionId, 'test-project', 'Test');
|
||||
|
||||
const summary = {
|
||||
request: 'My request',
|
||||
investigated: 'Investigation',
|
||||
learned: 'Learnings',
|
||||
completed: 'Completions',
|
||||
next_steps: 'Next',
|
||||
notes: 'Some notes'
|
||||
};
|
||||
|
||||
store.storeSummary(contentSessionId, 'test-project', summary, 1);
|
||||
|
||||
const retrieved = store.getSummaryForSession(contentSessionId);
|
||||
|
||||
expect(retrieved).not.toBeNull();
|
||||
expect(retrieved?.request).toBe('My request');
|
||||
expect(retrieved?.notes).toBe('Some notes');
|
||||
});
|
||||
});
|
||||
|
||||
describe('saveUserPrompt - Content Session ID Reference', () => {
|
||||
it('should store user prompt with content_session_id as foreign key', () => {
|
||||
const contentSessionId = 'prompt-test-session';
|
||||
store.createSDKSession(contentSessionId, 'test-project', 'Initial');
|
||||
|
||||
const promptId = store.saveUserPrompt(contentSessionId, 1, 'First user prompt');
|
||||
|
||||
// Verify the prompt was stored with content_session_id
|
||||
const stored = store.db.prepare(
|
||||
'SELECT content_session_id FROM user_prompts WHERE id = ?'
|
||||
).get(promptId) as { content_session_id: string };
|
||||
|
||||
expect(stored.content_session_id).toBe(contentSessionId);
|
||||
});
|
||||
|
||||
it('should be countable by getPromptNumberFromUserPrompts using content_session_id', () => {
|
||||
const contentSessionId = 'prompt-count-session';
|
||||
store.createSDKSession(contentSessionId, 'test-project', 'Initial');
|
||||
|
||||
expect(store.getPromptNumberFromUserPrompts(contentSessionId)).toBe(0);
|
||||
|
||||
store.saveUserPrompt(contentSessionId, 1, 'First');
|
||||
expect(store.getPromptNumberFromUserPrompts(contentSessionId)).toBe(1);
|
||||
|
||||
store.saveUserPrompt(contentSessionId, 2, 'Second');
|
||||
expect(store.getPromptNumberFromUserPrompts(contentSessionId)).toBe(2);
|
||||
});
|
||||
|
||||
it('should be retrievable by getUserPrompt using content_session_id', () => {
|
||||
const contentSessionId = 'prompt-retrieve-session';
|
||||
store.createSDKSession(contentSessionId, 'test-project', 'Initial');
|
||||
|
||||
store.saveUserPrompt(contentSessionId, 1, 'Hello world');
|
||||
|
||||
const retrieved = store.getUserPrompt(contentSessionId, 1);
|
||||
|
||||
expect(retrieved).toBe('Hello world');
|
||||
});
|
||||
});
|
||||
|
||||
describe('getLatestUserPrompt - Joined Query with Both Session IDs', () => {
|
||||
it('should return prompt with both content_session_id and memory_session_id', () => {
|
||||
const contentSessionId = 'latest-prompt-session';
|
||||
const memorySessionId = 'captured-memory-for-latest';
|
||||
|
||||
const sessionDbId = store.createSDKSession(contentSessionId, 'test-project', 'Initial');
|
||||
store.updateMemorySessionId(sessionDbId, memorySessionId);
|
||||
store.saveUserPrompt(contentSessionId, 1, 'Latest prompt text');
|
||||
|
||||
const latest = store.getLatestUserPrompt(contentSessionId);
|
||||
|
||||
expect(latest).toBeDefined();
|
||||
expect(latest?.content_session_id).toBe(contentSessionId);
|
||||
expect(latest?.memory_session_id).toBe(memorySessionId);
|
||||
expect(latest?.prompt_text).toBe('Latest prompt text');
|
||||
});
|
||||
});
|
||||
|
||||
describe('getAllRecentUserPrompts - Joined Query with Project', () => {
|
||||
it('should return prompts with content_session_id and project from session', () => {
|
||||
const contentSessionId = 'all-prompts-session';
|
||||
store.createSDKSession(contentSessionId, 'my-project', 'Initial');
|
||||
store.saveUserPrompt(contentSessionId, 1, 'Prompt one');
|
||||
store.saveUserPrompt(contentSessionId, 2, 'Prompt two');
|
||||
|
||||
const prompts = store.getAllRecentUserPrompts(10);
|
||||
|
||||
expect(prompts.length).toBe(2);
|
||||
expect(prompts[0].content_session_id).toBe(contentSessionId);
|
||||
expect(prompts[0].project).toBe('my-project');
|
||||
});
|
||||
});
|
||||
|
||||
describe('Resume Functionality - Memory Session ID Usage', () => {
|
||||
it('should preserve memory_session_id across session re-initialization', () => {
|
||||
const contentSessionId = 'resume-test-session';
|
||||
const capturedMemoryId = 'sdk-memory-session-for-resume';
|
||||
|
||||
// Simulate first interaction: create session, then SDK responds with session ID
|
||||
const sessionDbId = store.createSDKSession(contentSessionId, 'test-project', 'First prompt');
|
||||
store.updateMemorySessionId(sessionDbId, capturedMemoryId);
|
||||
|
||||
// Simulate worker restart or new request: fetch session from database
|
||||
const retrievedSession = store.getSessionById(sessionDbId);
|
||||
|
||||
// The memory_session_id should be available for resume parameter
|
||||
expect(retrievedSession?.memory_session_id).toBe(capturedMemoryId);
|
||||
});
|
||||
|
||||
it('should support multiple observations linked to same memory_session_id', () => {
|
||||
const contentSessionId = 'multi-obs-session';
|
||||
store.createSDKSession(contentSessionId, 'test-project', 'Test');
|
||||
|
||||
// Store multiple observations
|
||||
for (let i = 1; i <= 5; i++) {
|
||||
store.storeObservation(contentSessionId, 'test-project', {
|
||||
type: 'discovery',
|
||||
title: `Observation ${i}`,
|
||||
subtitle: null,
|
||||
facts: [],
|
||||
narrative: null,
|
||||
concepts: [],
|
||||
files_read: [],
|
||||
files_modified: []
|
||||
}, i);
|
||||
}
|
||||
|
||||
const observations = store.getObservationsForSession(contentSessionId);
|
||||
expect(observations.length).toBe(5);
|
||||
|
||||
// All should have the same memory_session_id
|
||||
const directQuery = store.db.prepare(
|
||||
'SELECT DISTINCT memory_session_id FROM observations WHERE memory_session_id = ?'
|
||||
).all(contentSessionId) as Array<{ memory_session_id: string }>;
|
||||
|
||||
expect(directQuery.length).toBe(1);
|
||||
expect(directQuery[0].memory_session_id).toBe(contentSessionId);
|
||||
});
|
||||
});
|
||||
});