Refactor worker management and cleanup hooks
- Removed ensureWorkerRunning calls from multiple hooks (cleanup, context, new, save, summary) to streamline code and avoid unnecessary checks.
- Introduced fixed port usage for worker communication across hooks.
- Enhanced error handling in newHook, saveHook, and summaryHook to provide clearer messages for worker connection issues.
- Updated worker service to start without health checks, relying on PM2 for management.
- Cached Claude executable path to optimize repeated calls.
- Improved logging for better traceability of worker actions and errors.
# Experimental Release: Progressive Disclosure Context System

## 🧪 Branch: `feature/context-with-observations`

**Status:** Seeking user feedback before merging to main

**We'd love your testing and feedback!** This experimental branch reimagines how Claude-Mem presents context at session startup, using a progressive disclosure approach that could significantly improve Claude's ability to leverage past learnings.

---

## What is Progressive Disclosure?

Progressive disclosure is a **layered memory retrieval system** inspired by how humans remember information:

### Layer 1: Index (The "Table of Contents")
**Frontloaded at session start** - Claude sees:
- **What exists**: Titles of all recent observations and session summaries
- **Retrieval cost**: Token counts for each observation
- **Priority signals**: Type indicators (🔴 critical gotcha, 🟤 architectural decision, 🔵 explanatory)

### Layer 2: Details (On-Demand Retrieval)
**Retrieved via MCP search** - Claude fetches:
- Full observation narratives when deeper context is needed
- Results filtered by concept, file path, type, or keywords
- Only what's relevant to the current task

### Layer 3: Perfect Recall (Source of Truth)
**Direct code access** - When needed:
- Read actual source files for implementation details
- Access original transcripts for exact quotes
- Full context without compression artifacts
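The layer choice can be pictured as a small decision function. This is an illustrative sketch only: the `IndexEntry` shape and the type-based shortcut are assumptions for this example, not claude-mem's actual implementation.

```typescript
// Hypothetical sketch of the three-layer retrieval decision.
interface IndexEntry {
  id: number;
  title: string;
  type: string;   // e.g. "gotcha", "decision", "how-it-works"
  tokens: number; // estimated retrieval cost from the Layer 1 index
}

type Action = "fetch-observation" | "read-source";

function chooseLayer(entry: IndexEntry, sourceFileTokens: number): Action {
  // Critical types are worth fetching regardless of cost (Layer 2).
  if (entry.type === "gotcha" || entry.type === "decision") {
    return "fetch-observation";
  }
  // Otherwise compare the observation's cost against re-reading the file (Layer 3).
  return entry.tokens < sourceFileTokens ? "fetch-observation" : "read-source";
}

const entry: IndexEntry = { id: 2332, title: "Session ID NULL Constraint", type: "gotcha", tokens: 201 };
console.log(chooseLayer(entry, 2000)); // "fetch-observation"
```

The point of the index is that Claude can run exactly this kind of cost-benefit reasoning without loading any observation bodies first.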
---

## The Problem This Solves

### Current Version (v4.2.x) Limitation

The current context hook shows **only session summaries** at startup:

```markdown
**Session #312**: Put date/time at end of session titles
Completed: Added date/time to session list with proper formatting
Next Steps: Test edge cases with long dates
```

**Strengths:**
- ✅ Minimal token overhead (~800 tokens)
- ✅ Clean, readable summaries

**Weaknesses:**
- ❌ Claude doesn't know **what** detailed observations exist
- ❌ Can't make informed decisions about whether to search vs read code
- ❌ Often re-reads code to understand decisions that were already documented

### Experimental Version Enhancement

The experimental hook shows an **observation index** alongside session summaries:

```markdown
**src/hooks/context.ts**
| ID | Time | T | Title | Tokens |
|----|------|---|-------|--------|
| #2332 | 1:07 AM | 🔴 | Critical Bugfix: Session ID NULL Constraint | ~201 |
| #2340 | 1:10 AM | 🟠 | Remove Redundant Summary Section | ~280 |
| #2344 | 1:34 AM | 🔵 | Added progressive disclosure usage instructions | ~149 |
```

**Benefits:**
- ✅ Claude knows **what** learnings exist (titles/types)
- ✅ Token counts inform **cost-benefit** decisions (fetch ~200 tokens vs re-read a 2,000-line file)
- ✅ Progressive disclosure instructions **teach Claude** how to use the system
- ✅ Type indicators help prioritize (critical gotchas > explanatory notes)

**Trade-offs:**
- ⚠️ Higher initial token cost (~2,500 tokens vs ~800)
- ⚠️ More visual noise in the context output
- ❓ Unknown: Does this actually improve Claude's behavior enough to justify the cost?

---

## What's New in This Branch

### 1. Observation Index Display

Full table view of recent observations grouped by file:

```markdown
### Oct 25, 2025

**src/hooks/context.ts**
| ID | Time | T | Title | Tokens |
|----|------|---|-------|--------|
| #2296 | 12:12 AM | 🟢 | Session summaries now display date and time | ~141 |
| #2298 | 12:44 AM | 🔵 | Timeline rendering refactored | ~231 |

**General**
| ID | Time | T | Title | Tokens |
|----|------|---|-------|--------|
| #2301 | 12:50 AM | 🟢 | Development Task Breakdown Created | ~128 |
```

### 2. Token Cost Metadata

Every observation shows an estimated token count:
- Helps Claude decide: "Is it worth fetching this 500-token explanation, or should I just read the code?"
- Makes cost-benefit analysis explicit
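A common way to produce such estimates is a character-count heuristic. This is a hedged sketch: the ~4-characters-per-token rule is a widely used approximation for English text, not necessarily the estimator claude-mem ships with.

```typescript
// Rough token estimate: ~4 characters per token for English text.
// Illustrative heuristic; claude-mem's actual estimator may differ.
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

// The explicit cost-benefit check the index enables: fetch the observation
// only if it is cheaper than re-reading the source it summarizes.
function worthFetching(observationTokens: number, fileTokens: number): boolean {
  return observationTokens < fileTokens;
}

console.log(estimateTokens("a".repeat(800))); // 200
console.log(worthFetching(200, 2000));        // true
```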
### 3. Progressive Disclosure Instructions

A new guidance section teaches Claude how to use the system:

```markdown
💡 Progressive Disclosure: This index shows WHAT exists (titles) and retrieval COST (token counts).
- Use MCP search tools to fetch full observation details on-demand (Layer 2)
- Prefer searching observations over re-reading code for past decisions and learnings
- Critical types (🔴 gotcha, 🟤 decision, ⚖️ trade-off) often worth fetching immediately
```

### 4. Type-Based Priority System

Observations are categorized by importance:
- 🔴 **gotcha** - Critical bugs/blockers (fetch immediately)
- 🟤 **decision** - Architectural choices (high value)
- ⚖️ **trade-off** - Design considerations (prevents re-debating)
- 🟠 **why-it-exists** - Rationale documentation
- 🟡 **problem-solution** - How issues were solved
- 🟣 **discovery** - Important learnings
- 🔵 **how-it-works** - Explanatory/educational
- 🟢 **what-changed** - Implementation details
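The ordering above translates directly into a lookup table. A minimal sketch, assuming the list order reflects priority (the numeric weights are an assumption for this example, not values from the codebase):

```typescript
// Illustrative priority table for the observation types listed above.
// Lower number = fetch sooner.
const TYPE_PRIORITY: Record<string, { emoji: string; priority: number }> = {
  "gotcha":           { emoji: "🔴", priority: 1 },
  "decision":         { emoji: "🟤", priority: 2 },
  "trade-off":        { emoji: "⚖️", priority: 3 },
  "why-it-exists":    { emoji: "🟠", priority: 4 },
  "problem-solution": { emoji: "🟡", priority: 5 },
  "discovery":        { emoji: "🟣", priority: 6 },
  "how-it-works":     { emoji: "🔵", priority: 7 },
  "what-changed":     { emoji: "🟢", priority: 8 },
};

// Sort observation types so critical ones come first; unknown types sink.
function sortByPriority(types: string[]): string[] {
  return [...types].sort(
    (a, b) => (TYPE_PRIORITY[a]?.priority ?? 99) - (TYPE_PRIORITY[b]?.priority ?? 99)
  );
}

console.log(sortByPriority(["how-it-works", "gotcha", "decision"]));
// ["gotcha", "decision", "how-it-works"]
```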
---

## Testing Instructions

### Option 1: Quick Test (No Installation)

```bash
# Clone and checkout experimental branch
git clone https://github.com/thedotmack/claude-mem.git
cd claude-mem
git checkout feature/context-with-observations

# Build the experimental version
npm install
npm run build

# Navigate to YOUR project directory
cd /path/to/your/project

# Run the experimental context hook with full path
node /path/to/claude-mem/plugin/scripts/context-hook.js

# Example:
# cd ~/my-app
# node ~/Downloads/claude-mem/plugin/scripts/context-hook.js
```

**Important:** The context hook reads from the current working directory (cwd). You must run it from your project's root folder to see context for that specific project.

This shows you the new context format without installing the plugin.

### Option 2: Full Testing (Install Locally)

If you're already using claude-mem and want to test the experimental version:

```bash
# Navigate to your local claude-mem plugin directory
cd ~/.claude/plugins/marketplaces/thedotmack

# Checkout experimental branch
git fetch origin
git checkout feature/context-with-observations

# Rebuild
npm install
npm run build

# Restart Claude Code to see the new context injection
```

**⚠️ Warning:** This will replace your current context hook. To revert:
```bash
git checkout main
npm run build
```
---

## What We Want to Know

Please test the experimental branch and share your feedback on these questions:

### 1. Behavioral Impact
- ✅ **Does Claude use MCP search more effectively?**
  - Does it fetch observation details more often?
  - Does it make better decisions about when to search vs read code?

### 2. Token Cost Analysis
- 💰 **Do token counts influence Claude's retrieval decisions?**
  - Does Claude reference the token counts when deciding whether to fetch?
  - Example: "This observation is 500 tokens, so I'll read the code instead"

### 3. Instruction Effectiveness
- 📖 **Is the progressive disclosure guidance helpful or noisy?**
  - Does Claude seem to understand the layered retrieval concept?
  - Do the instructions clutter the context or improve clarity?

### 4. Efficiency Gains
- 🚀 **Does it reduce redundant code reading?**
  - Does Claude fetch learnings instead of re-reading entire files?
  - Overall: Is it faster/smarter despite the higher initial token cost?

### 5. User Experience
- 👤 **Is the observation table too cluttered?**
  - Does the table format help or hurt readability?
  - Would you prefer a different presentation?

---

## How to Provide Feedback

### 📣 GitHub Issues (Please Use This!)

**[→ Click here to open a new issue](https://github.com/thedotmack/claude-mem/issues/new)**

Add the label `feedback: progressive-disclosure` and use this template:

```markdown
## Progressive Disclosure Feedback

**Branch tested:** feature/context-with-observations
**Test duration:** [e.g., 2 days, 10 sessions]
**Project type:** [e.g., TypeScript library, React app, Python backend]

### What worked well:
- [Your positive observations]

### What didn't work:
- [Issues or concerns]

### Specific answers:
1. **Claude's MCP search usage:** [Improved/Same/Worse]
2. **Token count influence:** [Yes/No/Unclear]
3. **Instructions helpful:** [Yes/No/Too verbose]
4. **Code reading reduction:** [Yes/No/Hard to tell]
5. **Overall impression:** [Worth merging/Needs work/Not useful]

### Additional notes:
[Any other feedback, screenshots, or examples]
```

**Why issues?** It keeps all feedback in one searchable place and lets other users see what's being discussed. Please don't hesitate to open an issue - all feedback is valuable, positive or negative!
---

## Next Steps

Based on feedback, we'll decide:

### ✅ If Successful:
- Merge to `main` branch
- Release as v4.3.0
- Make progressive disclosure the default
- Potentially add verbosity settings (minimal/standard/detailed)

### ⚠️ If Mixed Results:
- Make it opt-in via settings: `CLAUDE_MEM_VERBOSE_CONTEXT=true`
- Default to current minimal approach
- Allow users to choose their preference

### ❌ If Unsuccessful:
- Keep as experimental branch
- Continue iterating on the approach
- May explore alternative presentation formats

---

## Technical Details

### Files Changed

- **src/hooks/context.ts** (lines 227-240)
  - Added progressive disclosure instructions
  - Enhanced observation table rendering
  - Token count display for each observation

### Token Cost Breakdown

**Current version (v4.2.x):**
- Session summaries only: ~800 tokens
  - 3 sessions × ~250 tokens each
- Minimal overhead

**Experimental version:**
- Progressive disclosure instructions: ~150 tokens
- Observation index: ~2,000 tokens
  - 50 observations × ~40 tokens per row
- Session summaries: ~800 tokens
- **Total: ~2,950 tokens**

**ROI Analysis:**
- If this prevents even ONE 2,000-token file read per session, it pays for itself
- If Claude makes smarter retrieval decisions, overall token usage could be lower
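The arithmetic in the breakdown can be checked directly. A tiny sketch (the numbers come from the breakdown above; the payback framing is illustrative):

```typescript
// Sanity-check the token math from the breakdown above.
const instructions = 150;
const indexRows = 50 * 40; // 50 observations × ~40 tokens per row
const summaries = 800;
const experimentalTotal = instructions + indexRows + summaries; // 2950

const currentTotal = 800;
const overhead = experimentalTotal - currentTotal; // 2150

// One avoided ~2,000-token file read covers most of the overhead;
// avoiding a second read puts the index comfortably ahead.
console.log(experimentalTotal, overhead); // 2950 2150
```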
---

## Acknowledgments

This experimental feature was inspired by:
- Anthropic's "Effective context engineering for AI agents" (Sept 2025)
- Claude Skills' progressive disclosure architecture (Oct 2025)
- Real-world usage patterns from 200+ GitHub stars in 36 hours

Special thanks to our early adopters for pushing the boundaries of what's possible with persistent memory!

---

## Questions?

- 📖 **Docs:** [docs/](docs/)
- 🐛 **Issues:** [GitHub Issues](https://github.com/thedotmack/claude-mem/issues)
- 💬 **Discussion:** [GitHub Discussions](https://github.com/thedotmack/claude-mem/discussions)

---

**Happy Testing!** 🧪

We're excited to hear what you discover with progressive disclosure. This could be a game-changer for how Claude leverages long-term memory, but we need your real-world testing to validate the approach.

— Alex Newman ([@thedotmack](https://github.com/thedotmack))
# Feature Implementation Plan: Hybrid Search (Chroma + SQLite)

## Status: Experimental validation complete, ready for production implementation

## Experiment Results Summary

**Branch:** `experiment/chroma-mcp`
**Validation:** Semantic search (Chroma) + temporal filtering (SQLite) working correctly
**Collection:** `cm__claude-mem` with 2,800+ documents synced
**Decision:** Proceed with production implementation

---

## Implementation Plan

### Phase 1: Clean Start

#### 1.1 Create Feature Branch
```bash
# Start from clean main branch
git checkout main
git pull origin main

# Create new feature branch
git checkout -b feature/hybrid-search
```

#### 1.2 Port Working Experiment Scripts

**Files to keep (these work correctly):**
- `experiment/chroma-sync-experiment.ts` - Syncs SQLite → Chroma
- `experiment/chroma-search-test.ts` - Validates search quality
- `experiment/README.md` - Experiment documentation
- `experiment/RESULTS.md` - Update with accurate current results

**Actions:**
```bash
# Cherry-pick only the experiment files from experiment/chroma-mcp
git checkout experiment/chroma-mcp -- experiment/

# Remove any experiment artifacts that reference the old implementation
# (test-chroma-connection.ts uses the broken ChromaOrchestrator)
git rm test-chroma-connection.ts 2>/dev/null || true

# Commit clean experiment baseline
git commit -m "Add validated Chroma search experiments"
```

---

### Phase 2: Production Architecture

#### 2.1 Design Principles

**Core Rules:**
1. ✅ Direct MCP client usage (no wrapper abstractions)
2. ✅ Inline helper functions (no ChromaOrchestrator)
3. ✅ Each search workflow is deterministic (no implicit fallback chains; FTS5 is used only when Chroma is unavailable)
4. ✅ Temporal boundaries prevent stale results
5. ✅ Chroma handles semantic ranking, SQLite handles recency

**File Structure:**
```
src/
├── servers/
│   └── search-server.ts       # Hybrid MCP server (SQLite + Chroma)
├── services/
│   ├── sqlite/
│   │   ├── SessionStore.ts    # SQLite CRUD (unchanged)
│   │   └── SessionSearch.ts   # FTS5 search (fallback if Chroma fails)
│   └── sync/
│       └── ChromaSync.ts      # NEW: Sync SQLite → Chroma on observation save
└── shared/
    └── paths.ts               # Add VECTOR_DB_DIR constant
```

#### 2.2 Search Workflows

**Workflow 1: search_observations (Semantic-First, Temporally-Bounded)**
```
User Query → Chroma semantic search (top 100)
           → Filter: created_at_epoch > (now - 90 days)
           → SQLite: Hydrate full records
           → Sort: created_at_epoch DESC
           → Return: Recent + semantically relevant
```

**Workflow 2: find_by_concept/type/file (Metadata-First, Semantic-Enhanced)**
```
User Query → SQLite: Filter by metadata (type/concept/file)
           → Chroma: Rank filtered IDs by semantic relevance
           → SQLite: Hydrate in semantic rank order
           → Return: Metadata-filtered + semantically ranked
```

**Workflow 3: search_sessions (SQLite FTS5 only)**
```
User Query → SQLite FTS5 search (sessions are already summarized)
           → Return: Keyword matches
```

**Workflow 4: get_recent_context (Temporal-First, No Semantic)**
```
Hook Request → SQLite: Last 50 observations ORDER BY created_at_epoch DESC
             → Return: Most recent context (no semantic ranking needed)
```
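The post-processing step shared by Workflow 1 (recency boundary, then newest-first ordering) can be sketched in isolation. `Hit` and the millisecond epoch are assumptions for this example; the real code's `created_at_epoch` unit may differ, and the Chroma call and SQLite hydration are left out.

```typescript
// Hedged sketch of Workflow 1's filter/sort stage (semantic hits in,
// temporally bounded SQLite IDs out, ready for hydration).
interface Hit {
  sqliteId: number;
  createdAtEpoch: number; // assumed milliseconds for this sketch
  distance: number;       // semantic distance from Chroma (unused here)
}

const NINETY_DAYS_MS = 90 * 24 * 60 * 60 * 1000;

function boundAndOrder(hits: Hit[], nowMs: number): number[] {
  return hits
    .filter(h => h.createdAtEpoch > nowMs - NINETY_DAYS_MS) // recency boundary
    .sort((a, b) => b.createdAtEpoch - a.createdAtEpoch)    // newest first
    .map(h => h.sqliteId);                                  // IDs to hydrate from SQLite
}
```

The deliberate design choice here is that semantic rank decides *which* hits survive, but recency decides the final order, so an old, semantically perfect match can never outrank recent work.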
---

### Phase 3: Implementation Steps

#### 3.1 Add Chroma Support to search-server.ts

**File:** `src/servers/search-server.ts`

**Changes:**
1. Add Chroma MCP client initialization (lines 20-26):
```typescript
let chromaClient: Client;
const COLLECTION_NAME = 'cm__claude-mem';
```

2. Add a `queryChroma()` helper function with proper Python dict parsing:
```typescript
async function queryChroma(
  query: string,
  limit: number,
  whereFilter?: Record<string, any>
): Promise<{ ids: number[]; distances: number[]; metadatas: any[] }>
```

3. Initialize the Chroma client in `main()`:
```typescript
const chromaTransport = new StdioClientTransport({
  command: 'uvx',
  args: ['chroma-mcp', '--client-type', 'persistent', '--data-dir', VECTOR_DB_DIR]
});
chromaClient = new Client({...});
await chromaClient.connect(chromaTransport);
```

4. Update the `search_observations` handler (lines 350-427):
   - Replace FTS5 search with Chroma semantic search
   - Add the 90-day temporal filter
   - Hydrate from SQLite in temporal order

5. Update the `find_by_concept` handler (lines 501-575):
   - SQLite metadata filter first
   - Chroma semantic ranking second
   - Preserve semantic rank order in final results

6. Update the `find_by_type` handler (lines 720-797):
   - Same pattern as find_by_concept

7. Update the `find_by_file` handler (lines 592-700):
   - Same pattern as find_by_concept

**IMPORTANT:**
- Keep `SessionSearch` as a fallback (if the Chroma client fails to connect)
- Add error handling: if a Chroma query fails, fall back to FTS5
- Log all Chroma operations to stderr for debugging
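The "Python dict parsing" that `queryChroma()` needs can be sketched separately: the chroma-mcp server returns tool results as Python repr strings (single quotes, `None`) rather than strict JSON. The naive quote-swap below is an assumption adequate only for flat metadata without embedded quotes; it is illustrative, not robust, and `extractHits` and its input shape are hypothetical.

```typescript
// Convert a Python repr string to a JS object. Naive: breaks on values
// containing quotes; fine for flat numeric/identifier metadata.
function parsePythonDict(text: string): any {
  return JSON.parse(text.replace(/'/g, '"').replace(/\bNone\b/g, "null"));
}

// Pull SQLite IDs and distances out of a (assumed) Chroma query response.
function extractHits(raw: string): { ids: number[]; distances: number[] } {
  const parsed = parsePythonDict(raw);
  const metadatas: any[] = parsed.metadatas?.[0] ?? [];
  return {
    ids: metadatas.map(m => m.sqlite_id), // cross-reference back to SQLite
    distances: parsed.distances?.[0] ?? [],
  };
}
```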
#### 3.2 Add VECTOR_DB_DIR Path Constant

**File:** `src/shared/paths.ts`

```typescript
export const VECTOR_DB_DIR = path.join(DATA_DIR, 'vector-db');
```

#### 3.3 Add Automatic Sync Service

**NEW File:** `src/services/sync/ChromaSync.ts`

**Purpose:** Automatically sync new observations to Chroma when the worker saves them

**Key Methods:**
```typescript
class ChromaSync {
  async syncObservation(obs: Observation): Promise<void>
  async syncBatch(observations: Observation[]): Promise<void>
  async ensureCollection(): Promise<void>
}
```

**Integration Points:**
- `worker-service.ts` - After saving an observation to SQLite, call `chromaSync.syncObservation()`
- Batch sync on startup: sync any observations not yet in Chroma

**Document Format (per experiment):**
```typescript
// Each observation creates multiple Chroma documents (one per semantic chunk)
id: `obs_${obs.id}_title`
document: obs.title
metadata: { sqlite_id: obs.id, type: obs.type, created_at_epoch: obs.created_at_epoch }

id: `obs_${obs.id}_narrative`
document: obs.narrative
metadata: { sqlite_id: obs.id, type: obs.type, created_at_epoch: obs.created_at_epoch }

// Facts become individual searchable chunks
id: `obs_${obs.id}_fact_${i}`
document: fact
metadata: { sqlite_id: obs.id, type: obs.type, created_at_epoch: obs.created_at_epoch }
```
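The fan-out above (one observation → title, narrative, and per-fact documents) can be made concrete as a pure mapping function. The `Observation` shape is abbreviated to the fields the mapping uses; this is a sketch of the format, not ChromaSync's actual code.

```typescript
// Sketch of the document-format mapping: one observation fans out into
// title, narrative, and per-fact chunks, all sharing the same metadata.
interface Observation {
  id: number;
  type: string;
  title: string;
  narrative: string;
  facts: string[];
  created_at_epoch: number;
}

interface ChromaDoc {
  id: string;
  document: string;
  metadata: Record<string, unknown>;
}

function toChromaDocs(obs: Observation): ChromaDoc[] {
  const metadata = {
    sqlite_id: obs.id, // cross-reference for SQLite hydration
    type: obs.type,
    created_at_epoch: obs.created_at_epoch,
  };
  return [
    { id: `obs_${obs.id}_title`, document: obs.title, metadata },
    { id: `obs_${obs.id}_narrative`, document: obs.narrative, metadata },
    ...obs.facts.map((fact, i) => ({
      id: `obs_${obs.id}_fact_${i}`, document: fact, metadata,
    })),
  ];
}
```

Chunking per fact means a single concrete detail can match a query semantically even when the narrative as a whole would not, while `sqlite_id` lets all chunks collapse back to one record at hydration time.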
---

### Phase 4: Build and Validation

#### 4.1 Build Process
```bash
# Build all scripts
npm run build

# Verify outputs
ls -lh plugin/scripts/search-server.js   # Should exist (ESM)
ls -lh plugin/scripts/search-server.cjs  # Should NOT exist (delete if present)

# Check build format
head -1 plugin/scripts/search-server.js  # Should show: #!/usr/bin/env node
```

#### 4.2 Validation Checklist

**✅ Pre-deployment checks:**
1. Run the sync experiment: `npx tsx experiment/chroma-sync-experiment.ts`
   - Verify the collection is created
   - Verify documents are synced
   - Check that the document count matches the observation count

2. Run the search test: `npx tsx experiment/chroma-search-test.ts`
   - Verify semantic queries return results
   - Compare quality vs FTS5
   - Document results in RESULTS.md

3. Test the MCP server standalone:
```bash
# Start server manually
node plugin/scripts/search-server.js

# In another terminal, test with MCP inspector
npx @modelcontextprotocol/inspector node plugin/scripts/search-server.js
```

4. Test with Claude Code:
```bash
# Deploy to plugin directory
cp -r plugin/* ~/.claude/plugins/marketplaces/thedotmack/

# Restart worker
pm2 restart claude-mem-worker

# Start new Claude session and test search tools
```

**✅ Smoke tests:**
- Search for recent work: should return the last 90 days
- Search for old concepts: should filter by recency
- Search by file: should return file-specific observations
- Search by type: should return only that type
---

### Phase 5: Documentation

#### 5.1 Update CLAUDE.md

Add to the "What It Does" section:
```markdown
### Hybrid Search Architecture

Claude-mem uses a hybrid search system combining:
- **Semantic Search (Chroma)**: Vector embeddings for conceptual understanding
- **Keyword Search (SQLite FTS5)**: Full-text search for exact matches
- **Temporal Filtering**: 90-day recency boundary prevents stale results

Search workflows automatically choose the optimal combination:
- Conceptual queries → Semantic-first, temporally bounded
- Metadata queries → Metadata-first, semantically enhanced
- Recent context → Temporal-first (no semantic ranking)
```

#### 5.2 Update Architecture Section

```markdown
### Vector Database Layer

**Technology**: ChromaDB via the Chroma MCP server
**Location**: `~/.claude-mem/vector-db/`
**Collection**: `cm__claude-mem`

**Sync Strategy**:
- Worker service syncs observations to Chroma after the SQLite save
- Each observation creates multiple vector documents (title, narrative, facts)
- Metadata includes `sqlite_id` for cross-reference

**Search Strategy**:
- Semantic queries use Chroma with a 90-day temporal filter
- Metadata queries filter SQLite first, then rank semantically
- Fallback to FTS5 if Chroma is unavailable
```

#### 5.3 Write Release Notes

**File:** `EXPERIMENTAL_RELEASE_NOTES.md`

```markdown
# Hybrid Search Release (v4.4.0)

## Breaking Changes
None - search MCP tools keep the same interface

## New Features

### Semantic Search via Chroma
- Added ChromaDB integration for vector-based semantic search
- Observations automatically synced to the vector database
- Search understands conceptual queries (not just keywords)

### Hybrid Search Workflows
- `search_observations`: Semantic search with a 90-day recency filter
- `find_by_concept/type/file`: Metadata filtering + semantic ranking
- Automatic fallback to FTS5 if Chroma is unavailable

### Sync Automation
- Worker service auto-syncs new observations to Chroma
- Batch sync on startup for any missing observations
- Collection: `cm__claude-mem` in `~/.claude-mem/vector-db/`

## Technical Details

**New Dependencies:**
- `@modelcontextprotocol/sdk` (already present)
- External: `uvx chroma-mcp` (Python package via uvx)

**New Files:**
- `src/services/sync/ChromaSync.ts` - Auto-sync service
- `experiment/chroma-sync-experiment.ts` - Manual sync tool
- `experiment/chroma-search-test.ts` - Search quality validator

**Modified Files:**
- `src/servers/search-server.ts` - Hybrid search implementation
- `src/services/worker-service.ts` - Auto-sync integration
- `src/shared/paths.ts` - Added the VECTOR_DB_DIR constant

**Design Rationale:**
- Temporal boundaries prevent old, semantically perfect matches from outranking recent updates
- Metadata-first filtering eliminates irrelevant categories before semantic ranking
- Direct MCP client usage avoids abstraction overhead
- Inline helpers keep parsing logic close to its usage
```
---

### Phase 6: Deployment

#### 6.1 Pre-merge Validation
```bash
# Ensure all tests pass
npm run build
npm run test:parser  # If applicable

# Validate experiment results
npx tsx experiment/chroma-sync-experiment.ts
npx tsx experiment/chroma-search-test.ts

# Test production MCP server
node plugin/scripts/search-server.js &
# Send test queries via MCP inspector

# Clean build artifacts
rm -f plugin/scripts/*.cjs  # Remove stale CommonJS builds
```

#### 6.2 Commit Strategy
```bash
# Commit 1: Experiment scripts (already done if following the plan)
git add experiment/
git commit -m "Add validated Chroma search experiments"

# Commit 2: Core implementation
git add src/servers/search-server.ts src/shared/paths.ts
git commit -m "Implement hybrid search: Chroma semantic + SQLite temporal"

# Commit 3: Auto-sync service
git add src/services/sync/ src/services/worker-service.ts
git commit -m "Add automatic observation sync to Chroma vector DB"

# Commit 4: Documentation
git add CLAUDE.md EXPERIMENTAL_RELEASE_NOTES.md
git commit -m "Document hybrid search architecture and usage"

# Commit 5: Build artifacts
npm run build
git add plugin/scripts/
git commit -m "Build hybrid search implementation"
```

#### 6.3 Merge to Main
```bash
# Push feature branch
git push origin feature/hybrid-search

# Create PR or merge directly (your choice)
git checkout main
git merge feature/hybrid-search
git push origin main

# Tag release
git tag v4.4.0
git push origin v4.4.0
```
---

## Rollback Plan

If issues arise post-deployment:

```bash
# Quick rollback
git checkout main
git revert HEAD~5..HEAD  # Revert the last 5 commits
git push origin main

# Or cherry-pick the revert
git checkout -b hotfix/rollback-hybrid-search
git revert <commit-sha>
git push origin hotfix/rollback-hybrid-search
```

**Chroma data cleanup (if needed):**
```bash
# Remove the vector database
rm -rf ~/.claude-mem/vector-db/

# The search server will fall back to FTS5 if Chroma is unavailable
```

---

## Success Criteria

**Must have before merge:**
- ✅ Sync experiment completes without errors
- ✅ Search test shows Chroma returning results
- ✅ MCP server starts and responds to queries
- ✅ Fallback to FTS5 works if Chroma is unavailable
- ✅ No breaking changes to existing MCP tool interfaces
- ✅ Documentation updated
- ✅ No uncommitted changes
- ✅ No dead code (ChromaOrchestrator removed)
- ✅ No stale build artifacts (.cjs files)

**Nice to have:**
- Performance benchmarks (Chroma vs FTS5 query time)
- Search quality metrics (relevance scores)
- Token usage comparison (semantic vs keyword results)

---

## Timeline Estimate

- Phase 1 (Clean Start): 15 minutes
- Phase 2 (Architecture Review): 30 minutes
- Phase 3 (Implementation): 2-3 hours
- Phase 4 (Validation): 1 hour
- Phase 5 (Documentation): 1 hour
- Phase 6 (Deployment): 30 minutes

**Total: ~5-6 hours** for a complete, validated implementation

---

## Notes

- The experiment validated that semantic search works and provides value
- This plan avoids the mistakes from the previous attempt:
  - ✅ Clean branch from main (no baggage)
  - ✅ Implementation AFTER experiment validation
  - ✅ No dead code (ChromaOrchestrator)
  - ✅ Proper commit strategy
  - ✅ Complete documentation
  - ✅ Validation at every step
# 🧪 Experimental: Progressive Disclosure Context System
|
||||
|
||||
> **We'd love your feedback!** Test the new context injection approach and share your experience.
|
||||
|
||||
## What is Progressive Disclosure?
|
||||
|
||||
A **layered memory retrieval system** that shows Claude:
|
||||
1. **Index** (frontloaded): What observations exist + token costs
|
||||
2. **Details** (on-demand): Full narratives via MCP search
|
||||
3. **Perfect recall**: Source code when needed
|
||||
|
||||
**The idea:** Instead of hiding observations completely, show an index so Claude can make informed decisions about what to fetch.
|
||||
|
||||
## Try It Out
|
||||
|
||||
```bash
|
||||
# Clone and build experimental version
|
||||
git clone https://github.com/thedotmack/claude-mem.git
|
||||
cd claude-mem
|
||||
git checkout feature/context-with-observations
|
||||
npm install && npm run build
|
||||
|
||||
# Navigate to YOUR project and run the hook
|
||||
cd /path/to/your/project
|
||||
node /path/to/claude-mem/plugin/scripts/context-hook.js
|
||||
```
|
||||
|
||||
**Important:** Run from your project's root directory to see context for that project.
|
||||
|
||||
## What's Different?
|
||||
|
||||
**Current (v4.2.x):** Session summaries only (~800 tokens)
|
||||
```markdown
|
||||
Session #312: Put date/time at end of session titles
|
||||
Completed: Added formatting
|
||||
Next: Test edge cases
|
||||
```
|
||||
|
||||
**Experimental:** Observation index + summaries (~2,500 tokens)
|
||||
```markdown
|
||||
**src/hooks/context.ts**
|
||||
| ID | Time | T | Title | Tokens |
|
||||
|----|------|---|-------|--------|
|
||||
| #2332 | 1:07 AM | 🔴 | Critical Bugfix: Session ID NULL | ~201 |
|
||||
| #2340 | 1:10 AM | 🟠 | Remove Redundant Summary Section | ~280 |
|
||||
```
|
||||
|
||||
Now Claude knows:
|
||||
- What learnings exist (without loading them)
|
||||
- Cost to fetch details (~200 tokens)
|
||||
- Priority (🔴 critical vs 🔵 informational)
|
||||
|
||||
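The per-observation token counts in the index (e.g. "~201") can be approximated with the common ~4-characters-per-token heuristic. The sketch below is purely illustrative; the actual hook may use a real tokenizer, and `estimateTokens` is a hypothetical name:

```typescript
// Hypothetical sketch: approximate an observation's retrieval cost in tokens
// using the rough 4-characters-per-token heuristic. Illustration only; the
// real context hook may count tokens differently.
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

// Example: an ~800-character observation narrative displays as "~201" tokens.
const narrative = "x".repeat(804);
console.log(`~${estimateTokens(narrative)}`); // ~201
```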
## We Want Your Feedback

Test the experimental branch and tell us:

✅ **Does Claude use MCP search more effectively?**
💰 **Do token counts influence retrieval decisions?**
📖 **Are the instructions helpful or noisy?**
🚀 **Does it reduce redundant code reading?**

### 📣 [Please Open a GitHub Issue](https://github.com/thedotmack/claude-mem/issues/new) With Your Experience!

Use the label `feedback: progressive-disclosure` - all feedback is valuable, positive or negative!

## Files Changed

- Updated `README.md` with experimental feature section
- Enhanced `src/hooks/context.ts` with progressive disclosure instructions
- New docs: `EXPERIMENTAL_RELEASE_NOTES.md` (full details)

## Next Steps

Based on your feedback:

- ✅ **If successful:** Merge to main, release as v4.3.0
- ⚠️ **If mixed:** Make opt-in via settings
- ❌ **If unsuccessful:** Keep iterating as experimental

---

**Full details:** See [EXPERIMENTAL_RELEASE_NOTES.md](EXPERIMENTAL_RELEASE_NOTES.md)

**Questions?** Join the discussion or open an issue!
@@ -1,503 +0,0 @@
# Hybrid Search Implementation Status

**Branch**: `feature/hybrid-search`
**Date**: 2025-10-31
**Status**: ⚠️ **PARTIALLY COMPLETE** - Needs completion and validation

---

## Executive Summary

The hybrid search feature combines semantic search (ChromaDB) with temporal filtering (SQLite) to provide better context retrieval for the claude-mem memory system. The experimental validation and initial implementation have been completed, but the production implementation is **incomplete** and requires additional work before merging to main.

### Quick Status
- ✅ **Experiment validated**: Chroma sync and search workflows work
- ⚠️ **Implementation incomplete**: search-server.ts partially updated
- ❌ **Auto-sync missing**: ChromaSync service not yet implemented
- ❌ **Testing incomplete**: MCP server not fully validated
- ❌ **Documentation pending**: CLAUDE.md and release notes not updated

---

## What Was Done

### 1. Experimental Validation (Commits: 867226c, 309e8a7)

**Files Added**:
- `experiment/chroma-sync-experiment.ts` - Manual sync tool (works ✅)
- `experiment/chroma-search-test.ts` - Search quality validator (works ✅)
- `experiment/README.md` - Experiment documentation
- `experiment/RESULTS.md` - Search quality comparison results

**Key Findings**:
- ✅ Chroma MCP connection works via `uvx chroma-mcp`
- ✅ Collection `cm__claude-mem` successfully created
- ✅ 1,390 observations synced → 8,279 vector documents
- ✅ Document format validated: `obs_{id}_{field}` with metadata
- ⚠️ Search quality results are **INCONCLUSIVE** (see Critical Issues below)

### 2. Planning Documents

**Files Created**:
- `FEATURE_PLAN_HYBRID_SEARCH.md` (486 lines) - Comprehensive 6-phase implementation plan
- `NEXT_SESSION_PROMPT.md` (193 lines) - Session continuation instructions

**Plan Structure**:
1. Phase 1: Clean Start ✅ (completed)
2. Phase 2: Architecture Review ✅ (documented)
3. Phase 3: Implementation ⚠️ (partially complete)
4. Phase 4: Validation ❌ (not started)
5. Phase 5: Documentation ❌ (not started)
6. Phase 6: Deployment ❌ (not started)

### 3. Production Code Changes

#### src/servers/search-server.ts (319 lines added)

**What Works**:
- ✅ Chroma MCP client imports added
- ✅ `queryChroma()` helper function implemented (95 lines)
  - Handles Python dict parsing with regex
  - Extracts IDs from document format `obs_{id}_{field}`
  - Parses distances and metadata correctly
- ✅ `search_observations` handler updated with hybrid workflow
  - Chroma semantic search (top 100)
  - 90-day temporal filter
  - SQLite hydration in temporal order
  - FTS5 fallback if Chroma fails
- ⚠️ `find_by_concept` handler **partially** updated
  - Metadata-first filtering via SQLite
  - Semantic ranking via Chroma
  - **INCOMPLETE**: Implementation cut off mid-function (line 554 in diff)

**What's Missing**:
- ❌ Chroma client initialization in `main()` function
- ❌ `find_by_type` handler not updated
- ❌ `find_by_file` handler not updated
- ❌ Error handling not comprehensive
- ❌ Logging not fully implemented

#### src/services/sqlite/SessionStore.ts (27 lines added)

**What Works**:
- ✅ `getObservationsByIds()` method added (lines 622-645)
  - Accepts array of IDs
  - Supports temporal ordering (date_desc/date_asc)
  - Supports limit parameter
  - Uses parameterized queries (SQL injection safe)
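The shape of such a query can be sketched as a pure SQL-building function. This is not the actual SessionStore code; the table and column names (`observations`, `created_at`) are assumptions for illustration:

```typescript
// Hypothetical sketch of a parameterized getObservationsByIds query:
// one "?" placeholder per ID, an ORDER BY direction derived from a fixed
// whitelist (never from user input), and an optional LIMIT placeholder.
type Order = "date_desc" | "date_asc";

function buildObservationsByIdsSql(ids: number[], order: Order, limit?: number): string {
  const placeholders = ids.map(() => "?").join(", "); // one per ID, SQL-injection safe
  const direction = order === "date_asc" ? "ASC" : "DESC";
  let sql = `SELECT * FROM observations WHERE id IN (${placeholders}) ORDER BY created_at ${direction}`;
  if (limit !== undefined) sql += " LIMIT ?";
  return sql; // bind [...ids, limit] when executing
}
```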
#### src/shared/paths.ts (1 line added)

**What Works**:
- ✅ `VECTOR_DB_DIR` constant added
  - Points to `~/.claude-mem/vector-db/`
  - Used by Chroma MCP client

---

## What's Next (Critical Path)

### Immediate Blockers (Must Fix Before Merge)

#### 1. Complete search-server.ts Implementation

**File**: `src/servers/search-server.ts`

**Missing Code**:

a) **Initialize Chroma client in main() function** (~20 lines):
```typescript
// Add to main() function before server.connect()
const chromaTransport = new StdioClientTransport({
  command: 'uvx',
  args: ['chroma-mcp', '--client-type', 'persistent', '--data-dir', VECTOR_DB_DIR]
});
chromaClient = new Client(
  { name: 'claude-mem-search-chroma-client', version: '1.0.0' },
  { capabilities: {} }
);
await chromaClient.connect(chromaTransport);
console.error('[search-server] Chroma client connected');
```

b) **Complete find_by_concept handler** (~30 lines):
- The implementation is cut off mid-function
- Need to complete the semantic ranking logic
- Need to hydrate results from SQLite in semantic rank order
- Need to add error handling and FTS5 fallback
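The missing ranking step can be sketched as a pure function, assuming Chroma returns IDs ordered best-first and SQLite returns the metadata-filtered candidate set (the function name is hypothetical):

```typescript
// Hypothetical sketch of the ranking logic the handler needs: keep only
// Chroma results that passed the SQLite metadata filter, preserving Chroma's
// rank order and dropping duplicates (one observation can yield several
// obs_{id}_{field} documents).
function rankFilteredIds(chromaRankedIds: number[], sqliteFilteredIds: number[]): number[] {
  const allowed = new Set(sqliteFilteredIds);
  const rankedIds: number[] = [];
  for (const chromaId of chromaRankedIds) {
    if (allowed.has(chromaId) && !rankedIds.includes(chromaId)) {
      rankedIds.push(chromaId);
    }
  }
  return rankedIds;
}
```

The resulting `rankedIds` would then be passed to SQLite hydration so results come back in semantic rank order.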
c) **Update find_by_type handler** (~50 lines):
- Same pattern as find_by_concept
- Metadata filter first (SQLite)
- Semantic ranking second (Chroma)
- Preserve rank order in results

d) **Update find_by_file handler** (~50 lines):
- Same pattern as find_by_concept
- File path filter first (SQLite)
- Semantic ranking second (Chroma)
- Preserve rank order in results

**Total Estimated Effort**: 2-3 hours

#### 2. Implement Auto-Sync Service

**NEW File**: `src/services/sync/ChromaSync.ts` (~200 lines)

**Purpose**: Automatically sync new observations to Chroma when the worker saves them

**Required Methods**:
```typescript
class ChromaSync {
  async syncObservation(obs: Observation): Promise<void>
  async syncBatch(observations: Observation[]): Promise<void>
  async ensureCollection(): Promise<void>
  private async connectChroma(): Promise<void>
  private formatObservationDocuments(obs: Observation): ChromaDocument[]
}
```

**Integration Points**:
- `src/services/worker-service.ts` - Call after saving observation to SQLite
- Batch sync on startup for any missing observations
- Use same document format as experiment: `obs_{id}_{field}`
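A minimal sketch of `formatObservationDocuments` under the `obs_{id}_{field}` convention follows. The `Observation` fields and metadata keys shown here are illustrative assumptions, not the real schema:

```typescript
// Hypothetical sketch: split one observation into per-field Chroma documents
// using the experiment's obs_{id}_{field} ID convention. Field and metadata
// names are assumptions for illustration.
interface Observation { id: number; title: string; narrative: string; createdAt: string; }
interface ChromaDocument { id: string; document: string; metadata: Record<string, string | number>; }

function formatObservationDocuments(obs: Observation): ChromaDocument[] {
  const fields: Array<[string, string]> = [
    ["title", obs.title],
    ["narrative", obs.narrative],
  ];
  return fields
    .filter(([, text]) => text.length > 0) // skip empty fields
    .map(([field, text]) => ({
      id: `obs_${obs.id}_${field}`,
      document: text,
      metadata: { observation_id: obs.id, field, created_at: obs.createdAt },
    }));
}
```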
**Total Estimated Effort**: 2-3 hours

#### 3. Build and Validation

**Steps**:
1. Build all scripts: `npm run build`
2. Verify ESM format: `head -1 plugin/scripts/search-server.js`
3. Delete stale builds: `rm -f plugin/scripts/*.cjs`
4. Test sync: `npx tsx experiment/chroma-sync-experiment.ts`
5. Test search: `npx tsx experiment/chroma-search-test.ts`
6. Test MCP server: Start manually and query via MCP inspector
7. Deploy and test in a Claude Code session

**Total Estimated Effort**: 1-2 hours

#### 4. Documentation Updates

**Files to Update**:
- `CLAUDE.md` - Add "Hybrid Search Architecture" section
- `CLAUDE.md` - Add "Vector Database Layer" section
- `CHANGELOG.md` - Add v4.4.0 release notes
- Consider: `EXPERIMENTAL_RELEASE_NOTES.md` (as suggested in plan)

**Total Estimated Effort**: 1 hour

---

## Critical Issues & Concerns

### 🔴 Issue #1: Inconclusive Search Quality Results

**Problem**: The experiment results in `RESULTS.md` show **contradictory** data:

- **Header claims**: "Semantic search outperformed by 3 queries (100% vs 63%)"
- **Actual results**: Chroma returned "No results" for 8/8 test queries
- **FTS5 results**: Returned results for 5/8 queries

**Analysis**:
Looking at the actual query results, **every semantic search query failed**:
- Query 1 (conceptual): Chroma ❌ No results, FTS5 ❌ No results
- Query 2 (patterns): Chroma ❌ No results, FTS5 ✅ 1 result
- Query 3 (file): Chroma ❌ No results, FTS5 ✅ 3 results
- Query 4 (function): Chroma ❌ No results, FTS5 ✅ 3 results
- Query 5 (technical): Chroma ❌ No results, FTS5 ❌ No results
- Query 6 (intent): Chroma ❌ No results, FTS5 ✅ 1 result
- Query 7 (error): Chroma ❌ No results, FTS5 ✅ 3 results
- Query 8 (design): Chroma ❌ No results, FTS5 ❌ No results

**Conclusion**: The summary at the top is **incorrect**. FTS5 actually outperformed Chroma 5-0.

**Root Cause Hypothesis**:
- The sync experiment created 8,279 documents from 1,390 observations
- The search test may have run **before** the sync completed
- Or the search test is using the wrong collection name
- Or the search test has a query parsing bug

**Action Required**:
- ✅ Re-run sync experiment (verified working above)
- ⚠️ Re-run search test to get accurate results
- ⚠️ Update RESULTS.md with correct findings
- ⚠️ **VALIDATE** that semantic search actually provides value before proceeding

### 🔴 Issue #2: Incomplete Implementation Cut Off Mid-Function

**Problem**: The `find_by_concept` handler in search-server.ts is incomplete (line 554 in diff). The code literally ends with:
```typescript
  if (ids.includes(chromaId) && !rankedIds.includes(chromaId)) {
    rankedIds.push(chromaId);
  }
}
```

**Impact**:
- Handler won't work (likely a syntax error)
- Can't test metadata-enhanced search workflows
- Blocks validation of the core feature

**Action Required**:
- Complete the handler implementation
- Add error handling
- Add FTS5 fallback
- Test with actual queries

### 🟡 Issue #3: No Auto-Sync Implementation

**Problem**: The ChromaSync service doesn't exist yet. Without it:
- New observations won't appear in semantic search results
- Users must manually run the sync experiment after each session
- The Chroma database will become stale over time

**Impact**:
- Feature is not production-ready
- User experience is broken (missing recent context)
- Manual intervention required after every coding session

**Action Required**:
- Implement `src/services/sync/ChromaSync.ts`
- Integrate with worker-service.ts
- Add batch sync on startup
- Test sync pipeline end-to-end

### 🟡 Issue #4: Chroma Client Not Initialized

**Problem**: search-server.ts declares the `chromaClient` variable but never initializes it in `main()`.

**Impact**:
- All Chroma queries will fail with "Chroma client not initialized"
- Code will fall back to FTS5 for every query
- The hybrid search feature is effectively disabled

**Action Required**:
- Add client initialization to the `main()` function
- Add connection error handling
- Log connection status for debugging

---

## Technical Debt & Concerns

### Design Pattern: Direct MCP Client Usage

**Current Approach**: The implementation uses direct MCP client calls with inline parsing helpers.

**Pros**:
- ✅ No abstraction overhead
- ✅ Parsing logic close to usage
- ✅ Avoids the ChromaOrchestrator dead-code pattern from the experiment/chroma-mcp branch

**Cons**:
- ⚠️ Duplicated parsing logic (the queryChroma helper is called from multiple handlers)
- ⚠️ Python dict parsing with regex is fragile
- ⚠️ Error handling must be duplicated across handlers

**Recommendation**: The current approach is acceptable, but consider extracting the parsing logic to a shared utility if it becomes more complex.
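The fragility the document describes can be illustrated with a minimal parsing sketch. The response shape assumed here (`'ids': [[...]]`, `'distances': [[...]]` in a Python-dict-formatted string) is an assumption based on the document's description, not a verified contract:

```typescript
// Minimal sketch of regex parsing over a Python-dict-formatted Chroma
// response string. Assumes document IDs follow the obs_{id}_{field}
// convention and distances appear as a nested list. Illustration only.
function parseChromaResponse(raw: string): { ids: number[]; distances: number[] } {
  const ids: number[] = [];
  // Pull the numeric observation ID out of each single-quoted document ID.
  for (const m of raw.matchAll(/'obs_(\d+)_[a-z_]+'/g)) {
    ids.push(Number(m[1]));
  }
  const distances: number[] = [];
  const distSection = raw.match(/'distances':\s*\[\[([^\]]*)\]\]/);
  if (distSection) {
    distances.push(
      ...distSection[1].split(",").map(s => Number(s.trim())).filter(n => !Number.isNaN(n)),
    );
  }
  return { ids, distances };
}
```

Any change in Chroma's string formatting silently breaks regexes like these, which is exactly why the document flags this approach as fragile.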
### Temporal Boundary: 90-Day Filter

**Current Setting**: Hard-coded 90-day recency window in search_observations handler.

**Concerns**:
- Not configurable
- May be too short for long-running projects
- May be too long for fast-moving projects
- No user control over recency vs semantic relevance trade-off

**Recommendation**: Consider making this configurable via MCP tool parameter in future iteration. For v4.4.0, 90 days is a reasonable default.
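The recency window itself is a small pure function, which also shows why making it configurable would be cheap. Field names here are illustrative assumptions:

```typescript
// Sketch of the 90-day temporal boundary applied to semantic candidates.
// Assumes each candidate carries an ISO timestamp; names are illustrative.
interface Candidate { id: number; createdAt: string; }

const NINETY_DAYS_MS = 90 * 24 * 60 * 60 * 1000;

function withinRecencyWindow(
  candidates: Candidate[],
  now: Date,
  windowMs: number = NINETY_DAYS_MS, // making this a parameter = configurability
): Candidate[] {
  const cutoff = now.getTime() - windowMs;
  return candidates.filter(c => new Date(c.createdAt).getTime() >= cutoff);
}
```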
### FTS5 Fallback Strategy

**Current Approach**: Each handler tries Chroma first, falls back to FTS5 on error.

**Pros**:
- ✅ Graceful degradation if Chroma unavailable
- ✅ No user-facing errors

**Cons**:
- ⚠️ Silent performance degradation (user doesn't know semantic search failed)
- ⚠️ No metrics on fallback frequency
- ⚠️ Doesn't distinguish between Chroma connection failure vs empty results

**Recommendation**: Add telemetry/logging to track fallback frequency. Consider user-visible warnings if Chroma is consistently unavailable.

---

## Validation Checklist (From Plan)

### Pre-Merge Requirements

**Code Completeness**:
- ❌ search-server.ts: Complete all handler implementations
- ❌ search-server.ts: Initialize Chroma client in main()
- ❌ ChromaSync.ts: Implement auto-sync service
- ❌ worker-service.ts: Integrate auto-sync calls

**Testing**:
- ⚠️ Sync experiment works (verified partially above)
- ❌ Search test shows Chroma returning relevant results (currently failing)
- ❌ MCP server starts and responds to queries
- ❌ Fallback to FTS5 works if Chroma unavailable
- ❌ Smoke tests pass (recent work, old concepts, file search, type search)

**Code Quality**:
- ✅ No breaking changes to MCP tool interfaces
- ✅ No dead code (ChromaOrchestrator not present)
- ⚠️ No stale build artifacts (need to verify)
- ❌ No uncommitted changes (will check after completion)

**Documentation**:
- ❌ CLAUDE.md updated with hybrid search architecture
- ❌ CHANGELOG.md has v4.4.0 release notes
- ❌ Experiment results validated and accurate

**Build**:
- ❌ Build succeeds without errors
- ❌ search-server.js is ESM format (not CJS)
- ❌ All hook scripts built correctly

---

## Recommended Next Steps

### Option A: Complete the Implementation (Recommended)

**Timeline**: 6-8 hours total

**Steps**:
1. **Re-validate experiments** (1 hour)
   - Delete and re-sync Chroma collection
   - Run search test and verify results
   - Update RESULTS.md with accurate findings
   - **DECISION POINT**: If semantic search doesn't work, stop here

2. **Complete search-server.ts** (2-3 hours)
   - Initialize Chroma client
   - Complete find_by_concept handler
   - Implement find_by_type handler
   - Implement find_by_file handler
   - Add comprehensive error handling

3. **Implement ChromaSync** (2-3 hours)
   - Create src/services/sync/ChromaSync.ts
   - Integrate with worker-service.ts
   - Test sync pipeline

4. **Validate and Document** (2 hours)
   - Build and test MCP server
   - Run smoke tests in Claude Code
   - Update CLAUDE.md
   - Write release notes

5. **Deploy** (30 minutes)
   - Merge to main
   - Tag v4.4.0
   - Deploy to production

### Option B: Pause and Re-Validate (Conservative)

**Timeline**: 2-3 hours

**Steps**:
1. Re-run search quality experiments with fresh sync
2. Get accurate performance comparison data
3. **DECISION**: Proceed with implementation OR abandon feature
4. If abandoning: Document findings, close branch, move on
5. If proceeding: Continue with Option A

### Option C: Ship Minimal Version (Fast Path)

**Timeline**: 4-5 hours

**Steps**:
1. Complete only search_observations handler (skip metadata handlers)
2. Skip auto-sync (keep manual sync experiment)
3. Document as "experimental feature"
4. Merge with feature flag to disable by default
5. Iterate in future versions

---

## File Changes Summary

### Added Files (6)
- `experiment/README.md` (53 lines)
- `experiment/RESULTS.md` (210 lines)
- `experiment/chroma-search-test.ts` (304 lines)
- `experiment/chroma-sync-experiment.ts` (315 lines)
- `FEATURE_PLAN_HYBRID_SEARCH.md` (486 lines)
- `NEXT_SESSION_PROMPT.md` (193 lines)

### Modified Files (10)
- `src/servers/search-server.ts` (+319 lines)
- `src/services/sqlite/SessionStore.ts` (+27 lines)
- `src/shared/paths.ts` (+1 line)
- `plugin/scripts/cleanup-hook.js` (rebuilt)
- `plugin/scripts/context-hook.js` (rebuilt)
- `plugin/scripts/new-hook.js` (rebuilt)
- `plugin/scripts/save-hook.js` (rebuilt)
- `plugin/scripts/search-server.js` (rebuilt)
- `plugin/scripts/summary-hook.js` (rebuilt)
- `plugin/scripts/worker-service.cjs` (rebuilt)

### Files to Create
- `src/services/sync/ChromaSync.ts` (new, ~200 lines)
- `EXPERIMENTAL_RELEASE_NOTES.md` (optional)

### Files to Update
- `CLAUDE.md` (add hybrid search sections)
- `CHANGELOG.md` (add v4.4.0 release notes)
- `experiment/RESULTS.md` (fix incorrect summary)

---

## Timeline Estimate

From FEATURE_PLAN_HYBRID_SEARCH.md:

| Phase | Status | Time Estimate |
|-------|--------|---------------|
| Phase 1: Clean Start | ✅ Complete | 15 min (done) |
| Phase 2: Architecture Review | ✅ Complete | 30 min (done) |
| Phase 3: Implementation | ⚠️ 40% done | 2-3 hours (remaining) |
| Phase 4: Validation | ❌ Not started | 1 hour |
| Phase 5: Documentation | ❌ Not started | 1 hour |
| Phase 6: Deployment | ❌ Not started | 30 min |
| **TOTAL** | **~40% complete** | **~5-6 hours remaining** |

---

## Related Sessions (from claude-mem context)

- **Session #S558**: Critical analysis of experiment/chroma-mcp branch (different branch, has issues)
- **Session #S559**: Critical analysis of THIS branch (identified design validation complete)
- **Session #S560**: Created NEXT_SESSION_PROMPT.md with corrective plan
- **Session #S561**: Attempted to start but NEXT_SESSION_PROMPT.md was missing (now exists)

**Key Observation from Session #2975**:
> "Hybrid Search Architecture Validated for Production Implementation"

However, this appears to be based on the **incorrect** summary in RESULTS.md. The actual test results show Chroma failing all queries. This needs re-validation before proceeding.

---

## Conclusion

The hybrid search feature is **partially implemented** and requires **5-6 hours of focused work** to complete. The most critical blocker is **validating that semantic search actually works** - the current RESULTS.md shows contradictory data.

**Recommended Action**:
1. Re-run search quality experiments with fresh sync
2. Get accurate performance data
3. Make GO/NO-GO decision based on real results
4. If GO: Complete implementation per Option A
5. If NO-GO: Document findings and close branch

**Risk Assessment**:
- 🔴 **HIGH**: Search quality results are contradictory and unvalidated
- 🟡 **MEDIUM**: Implementation is incomplete (missing handlers + auto-sync)
- 🟢 **LOW**: Architecture is sound, experiment scripts work, plan is comprehensive

**Confidence Level**: 60% - The feature CAN work, but needs validation and completion before merge.
@@ -1,193 +0,0 @@
# Prompt for Next Session: Hybrid Search Implementation

Copy this entire prompt into a new Claude Code session to continue the hybrid search feature implementation.

---

## Context

I'm working on the `claude-mem` project (persistent memory system for Claude Code). I have an experimental branch `experiment/chroma-mcp` that attempted to add semantic search via ChromaDB, but it has implementation issues and was done in the wrong order.

**Current Status:**
- ✅ Experiment validated: Semantic search (Chroma) + temporal filtering (SQLite) works
- ✅ Chroma collection `cm__claude-mem` has 2,800+ documents synced
- ✅ Search quality tests show semantic search provides value
- ❌ Production implementation has issues (dead code, uncommitted fixes, wrong process)
- ✅ Feature plan written and ready to execute

**Your Task:**
Follow the feature implementation plan in `FEATURE_PLAN_HYBRID_SEARCH.md` to implement hybrid search correctly from the ground up.

---

## Immediate Actions

1. **Read the feature plan:**
   ```
   Read: /Users/alexnewman/Scripts/claude-mem/FEATURE_PLAN_HYBRID_SEARCH.md
   ```

2. **Understand the experiment results:**
   - The experiment scripts work correctly
   - Chroma semantic search is functional
   - We just need to implement it properly in production

3. **Execute Phase 1 of the plan:**
   - Create new `feature/hybrid-search` branch from `main`
   - Port working experiment scripts from `experiment/chroma-mcp`
   - Clean up any dead code references

---

## Key Principles for This Implementation

1. **Start clean:** New branch from `main`, no baggage from failed attempt
2. **No abstractions:** Direct MCP client usage, no ChromaOrchestrator wrapper
3. **Validate at each step:** Don't commit until you've tested it works
4. **Proper parsing:** Chroma MCP returns Python dicts, not JSON - use regex parsing
5. **Temporal boundaries:** 90-day filter prevents stale semantic matches

---

## Files You'll Need to Work With

**Core Implementation:**
- `src/servers/search-server.ts` - Add hybrid search workflows
- `src/services/sync/ChromaSync.ts` - NEW: Auto-sync observations to Chroma
- `src/services/worker-service.ts` - Integrate auto-sync
- `src/shared/paths.ts` - Add VECTOR_DB_DIR constant

**Experiment Files (keep these, they work):**
- `experiment/chroma-sync-experiment.ts` - Manual sync tool
- `experiment/chroma-search-test.ts` - Search quality validator

**Files to DELETE (dead code from failed attempt):**
- `src/services/chroma/ChromaOrchestrator.ts` - Broken wrapper, never used
- `test-chroma-connection.ts` - Uses broken ChromaOrchestrator
- `plugin/scripts/search-server.cjs` - Stale CommonJS build

---

## Validation Checklist

Before committing any code, verify:

```bash
# 1. Build succeeds
npm run build

# 2. Sync works
npx tsx experiment/chroma-sync-experiment.ts

# 3. Search works
npx tsx experiment/chroma-search-test.ts

# 4. MCP server starts
node plugin/scripts/search-server.js
# (Ctrl+C to stop)

# 5. No dead code
grep -r "ChromaOrchestrator" src/  # Should return nothing

# 6. No stale builds
ls plugin/scripts/search-server.cjs  # Should not exist

# 7. Git status clean
git status  # No uncommitted changes to production files
```

---

## Implementation Workflow (from Phase 3 of plan)

### Step 1: Add queryChroma Helper
In `src/servers/search-server.ts`, add a helper function that:
- Takes: `query: string, limit: number, whereFilter?: object`
- Calls: `chromaClient.callTool({ name: 'chroma_query_documents', ... })`
- Parses: Python dict response with regex (see lines 256-318 in current branch for example)
- Returns: `{ ids: number[], distances: number[], metadatas: any[] }`

### Step 2: Initialize Chroma Client
In `main()` function:
```typescript
const chromaTransport = new StdioClientTransport({
  command: 'uvx',
  args: ['chroma-mcp', '--client-type', 'persistent', '--data-dir', VECTOR_DB_DIR]
});
chromaClient = new Client({ name: 'claude-mem-search-chroma-client', version: '1.0.0' }, { capabilities: {} });
await chromaClient.connect(chromaTransport);
```

### Step 3: Update search_observations Handler
Replace FTS5 keyword search with:
1. Chroma semantic search (top 100)
2. Filter by recency (90 days)
3. Hydrate from SQLite in temporal order
4. Return results
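The control flow of those four steps can be sketched as follows, assuming `queryChroma` and `getObservationsByIds` exist with roughly the shapes described above (this is an outline, not the actual handler; the recency check runs after hydration because the timestamps live in SQLite):

```typescript
// Outline of the hybrid search_observations workflow. The collaborator
// function shapes are assumptions based on the plan, not verified signatures.
interface Obs { id: number; createdAt: string; }

async function hybridSearch(
  query: string,
  queryChroma: (q: string, limit: number) => Promise<{ ids: number[] }>,
  getObservationsByIds: (ids: number[], order: "date_desc") => Promise<Obs[]>,
): Promise<Obs[]> {
  // 1. Semantic candidates from Chroma (top 100)
  const { ids } = await queryChroma(query, 100);
  // 2. Hydrate from SQLite in temporal order (newest first)
  const rows = await getObservationsByIds(ids, "date_desc");
  // 3. Drop anything outside the 90-day recency window, then return
  const cutoff = Date.now() - 90 * 24 * 60 * 60 * 1000;
  return rows.filter(r => new Date(r.createdAt).getTime() >= cutoff);
}
```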
### Step 4: Update Metadata Search Handlers
For `find_by_concept`, `find_by_type`, `find_by_file`:
1. SQLite metadata filter first
2. Chroma semantic ranking second
3. Preserve semantic rank order in results

---

## Expected Timeline

- Phase 1 (Clean Start): 15 minutes
- Phase 2 (Architecture Review): Already done, read the plan
- Phase 3 (Implementation): 2-3 hours
- Phase 4 (Validation): 1 hour
- Phase 5 (Documentation): 1 hour
- Phase 6 (Deployment): 30 minutes

**Total: ~5-6 hours**

---

## Questions to Ask Me

If you encounter any issues:

1. "The Chroma MCP client isn't connecting" → Check if `uvx chroma-mcp` is available
2. "Parsing errors from Chroma responses" → Show me the response format, I'll help fix regex
3. "Not sure about the search workflow logic" → Reference Phase 2.2 in the plan
4. "Should I commit now?" → Only if validation checklist passes
5. "Merge to main or PR?" → I'll decide, just get to Phase 6 first

---

## Success Criteria

Don't merge until ALL of these are true:

- ✅ Sync experiment completes without errors
- ✅ Search test shows Chroma returning relevant results
- ✅ MCP server starts and responds to queries
- ✅ Fallback to FTS5 works if Chroma unavailable
- ✅ No breaking changes to MCP tool interfaces
- ✅ Documentation updated (CLAUDE.md + release notes)
- ✅ No uncommitted changes in git status
- ✅ No dead code (ChromaOrchestrator removed)
- ✅ No stale build artifacts (.cjs files deleted)

---

## Start Here

```
1. Read the feature plan:
   Read: /Users/alexnewman/Scripts/claude-mem/FEATURE_PLAN_HYBRID_SEARCH.md

2. Create the feature branch:
   Bash: git checkout main && git pull && git checkout -b feature/hybrid-search

3. Begin Phase 1 of the plan (porting experiment scripts)

4. Work through each phase systematically, validating at each step

5. Ask me questions if anything is unclear
```

Let's build this correctly, from the ground up. Take your time and validate at each step.
@@ -1,84 +0,0 @@
|
||||
# LinkedIn Launch Post - Claude-mem v5.0
|
||||
|
||||
Every developer using Claude Code knows this workflow:
|
||||
|
||||
/init → Claude learns your codebase
|
||||
Work for a while → Context fills up
|
||||
/clear → Everything's gone
|
||||
Next session → Re-learn everything again
|
||||
|
||||
**Your AI coding assistant has amnesia.**
|
||||
|
||||
And it's costing you money and time on every session.
|
||||
|
||||
## The Solution
|
||||
|
||||
I built claude-mem: a persistent memory system that makes Claude remember across sessions.
|
||||
|
||||
Not conversation summaries. Not compressed chat logs. Actual persistent memory—capturing every tool execution, processing it with AI, and making it instantly recallable.
|
||||
|
||||
## How It Works
|
||||
|
||||
**Hybrid Architecture:**
|
||||
- ChromaDB for semantic vector search (finds conceptually relevant context)
|
||||
- SQLite for temporal ordering (newest information first)
|
||||
- FTS5 keyword search as fallback (works without Python)
|
||||
|
||||
**Automatic Context Loading:**
|
||||
Every session start loads your last 50 observations in <200ms. No /init. No research phase.
|
||||
|
||||
You see:
|
||||
→ What you were working on (session summaries)
|
||||
→ What Claude learned (bugfixes, features, decisions)
|
||||
→ Chronological timeline (newest first)
|
||||
→ Token costs (so you know what's expensive to recall)
|
||||
|
||||
## The Breakthrough: Temporal Context
|
||||
|
||||
Most AI memory systems focus on semantic similarity. But that's only half the equation.
|
||||
|
||||
**Without timestamps, information becomes stale.** A bugfix from yesterday is more relevant than architecture notes from last month—even if the semantic similarity is lower.
|
||||
|
||||
Claude-mem combines both: semantic relevance + temporal recency.
|
||||
|
||||
The result? Claude starts each session knowing your current codebase state. No re-learning. No wasted tokens.
|
||||
|
||||
## Real-World Impact
|
||||
|
||||
After months of development across 1,400+ sessions:
|
||||
- 8,200+ vector documents indexed
|
||||
- <200ms query performance
|
||||
- Session startup context loads automatically
|
||||
- Natural language search when you need something from weeks ago
|
||||
|
||||
My Claude rarely needs to /init anymore. Hit /clear, start new session, keep working.
|
||||
|
||||
## The Paradox
|
||||
|
||||
Claude-mem's startup context got so good that Claude rarely uses the search tools.
|
||||
|
||||
The last 50 observations is usually enough. But when you need to recall something specific from weeks ago, the context timeline instantly reconstructs that moment.

Development becomes **pleasant instead of repetitive.**
**Token-efficient instead of wasteful.**
**Focused instead of constantly re-explaining.**

---

**claude-mem v5.0 just shipped** 🚀

Open source (AGPL-3.0): https://github.com/thedotmack/claude-mem

Install in Claude Code:
```
/plugin marketplace add thedotmack/claude-mem
/plugin install claude-mem
```

Python optional but recommended for semantic search. Falls back to keyword search without it.

---

**Question for the community:** How much time do you spend re-explaining your codebase to AI assistants after clearing context?

#AI #DeveloperTools #ProductivityTools #ClaudeAI #OpenSource #VectorDatabase #SemanticSearch #DeveloperProductivity
@@ -1,114 +0,0 @@
# Your Claude forgets everything after /clear. Mine doesn't.

You know the cycle.

/init to learn your codebase. Claude reads everything, understands your architecture, builds context.

You work for a while. Context window fills up. Eventually you hit /clear.

Everything's gone.

Next session: Claude reads CLAUDE.md again. Does the research again. Re-learns your codebase again.

**Tokens cost money. Research takes time. Claude forgets.**

This cycle is killing productivity.

## I built persistent memory that survives /clear

Not summaries. Not compressed conversations. [Actual persistent memory](https://github.com/thedotmack/claude-mem)—capture everything Claude does, process it with AI, make it instantly recallable across sessions.

Early on I tried vector stores, MCPs, memory tools. ChromaDB for vector search. But documents were massive—great for semantic matching, terrible for context efficiency.

That led to the hybrid approach.

## How it works

SQLite database with semantic chunking. ChromaDB for vector search when you need it—incredibly fast, incredibly relevant. FTS5 keyword search as fallback.

The magic? This loads automatically at every session start. No /init. No research phase.

Here's what I see when I start a new session on my "claude-mem-performance" project:

```
📝 [claude-mem-performance] recent context
────────────────────────────────────────────────────────────

Legend: 🎯 session-request | 🔴 bugfix | 🟣 feature | 🔄 refactor | ✅ change | 🔵 discovery | 🧠 decision

💡 Progressive Disclosure: This index shows WHAT exists (titles) and retrieval COST (token counts).
→ Use MCP search tools to fetch full observation details on-demand (Layer 2)
→ Prefer searching observations over re-reading code for past decisions and learnings
→ Critical types (🔴 bugfix, 🧠 decision) often worth fetching immediately

Nov 3, 2025

🎯 #S651 Read headless-test.md and use plan mode to prepare for writing a test (Nov 3, 1:27 PM) [claude-mem://session-summary/651]

🎯 #S650 Read headless-test.md and use plan mode to prepare for writing a test (Nov 3, 1:27 PM) [claude-mem://session-summary/650]

test_automation.ts
#3280 1:31 PM ✅ Updated test automation prompts for Kanban board project (~125t)

🎯 #S652 Read headless-test.md and use plan mode to prepare for writing the test (Nov 3, 1:32 PM) [claude-mem://session-summary/652]

General
#3281 1:33 PM 🔵 Examined test automation script (~70t)

test_automation.ts
#3282 1:34 PM 🟣 Implemented full verbose output mode for tool execution visibility (~145t)
#3283 1:35 PM ✅ Enhanced plan generation streaming with partial message support (~109t)

🎯 #S653 Read headless-test.md and use plan mode to prepare for writing the test (Nov 3, 1:35 PM)

Completed: Modified the generatePlan function in test_automation.ts to support `includePartialMessages: true` and integrate the streamMessage handler for unified streaming output. This improves the real-time feedback mechanism during plan generation.

Next Steps: 1. Read and analyze headless-test.md to understand test requirements. 2. Use plan mode to generate a test implementation strategy. 3. Write the actual test based on the plan.
```

**What you're seeing:**
- Session summaries (🎯) - what you were working on
- What Claude learned - observations with type indicators (bugfix, feature, change, discovery)
- Token costs - so you know what's expensive to recall
- Chronological flow - recent work, newest first
- Loaded in <200ms at session start

Timeline order: your past sessions, Claude's work, what was learned, what's next.

And when you need something from weeks ago? Natural language search + instant timeline replay gets you there in <200ms.

## The breakthrough: temporal context

Most memories are duplicate knowledge. Your architecture doesn't fundamentally change every session.

But some memories are **changes**. Bugfixes. Refactors. Decisions.

Without timestamps, without knowing what's "newest," your information is stale. And stale information means Claude has to research—the token-heavy work I'm trying to eliminate.

## The paradox

Claude-mem's startup context got so good that Claude rarely uses the search tools anymore.

The last 50 observations at session start are usually enough. /clear doesn't reset anything—next session starts exactly where you left off.

But when you need to recall something specific from weeks ago, the context timeline instantly gets Claude back in the game for that exact task.

**No /init. No research phase. No re-learning.**

Just: start session, Claude knows your codebase, you work.

Development becomes pleasant instead of repetitive. Token-efficient instead of wasteful. Focused instead of constantly re-explaining.

---

**claude-mem v5.0** just shipped: https://github.com/thedotmack/claude-mem

Python optional but recommended for semantic search. Falls back to keyword search if you don't have it.

**Install in Claude Code:**
```
/plugin marketplace add thedotmack/claude-mem
/plugin install claude-mem
```

Anyone else tired of both paying and WAITING for Claude to re-learn their codebase after every /clear?

@@ -1,81 +0,0 @@
# The problem with AI memory isn't storage—it's the research tax

Every time you ask Claude to work on something, there's this invisible token cost you're paying before it even starts: contextualization.

"Fix the auth bug" requires Claude to first figure out:
- What auth system are you using?
- What changed recently?
- What was the last decision about auth?
- Is that info even current, or is it from 3 weeks ago before the refactor?

That research phase? That's your context window disappearing.

## I tried everything

Early in claude-mem's development, I was using ChromaDB for vector search. Semantic matching was great—find conceptually similar stuff across thousands of memories.

But here's what I learned watching the system work in real-time:

Most memories are duplicate knowledge. Your codebase architecture doesn't change every session.

But some memories are **changes**. Bugfixes. Refactors. Decisions.

And if you can't tell which one is the newest change, your information is stale, and Claude has to go researching. Which brings us back to: wasting tokens.

## Vector search alone isn't enough

Semantic search finds relevant documents. But it doesn't know that the "authentication decision" from 3 weeks ago was completely invalidated by yesterday's refactor.

Without temporal ordering, you get:
- 10 memories about your auth system
- No idea which is current
- Claude has to read them all and infer chronology
- Token waste

That's when the hybrid architecture clicked:

**ChromaDB for semantic relevance** (finds conceptually related memories)
↓
**90-day temporal filter** (removes ancient irrelevant stuff)
↓
**SQLite chronological ordering** (newest first)

Now when you search "auth changes," you get a timeline. Not a pile of memories you have to sort through.

## The "instant replay" feature

v5.0 adds something I'm calling timeline-on-demand.

You say: "Work on that feature from 2 weeks ago"

Instead of:
1. Search for "feature"
2. Get 50 results
3. Figure out which one you meant
4. Read context around it
5. Start working

You get:
1. Natural language search finds the anchor point
2. Timeline reconstructs everything around that moment
3. Claude's head is in the game, immediately

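Conceptually, the second flow is "find an anchor, then slice a chronological window around it." A toy sketch with hypothetical names; the real implementation works over SQLite records rather than an in-memory array:

```typescript
// Toy timeline reconstruction: a search finds an anchor record, then a
// chronological window around it is returned. Names are illustrative only.
interface Obs {
  id: number;
  title: string;
}

function timelineAround(records: Obs[], query: string, before = 2, after = 2): Obs[] {
  // records are assumed to already be in chronological order (oldest first)
  const anchor = records.findIndex((r) => r.title.includes(query));
  if (anchor === -1) return [];
  return records.slice(Math.max(0, anchor - before), anchor + after + 1);
}

const records: Obs[] = [1, 2, 3, 4, 5, 6, 7].map((id) => ({
  id,
  title: id === 4 ? "feature: timeline search" : `observation ${id}`,
}));

console.log(timelineAround(records, "feature").map((r) => r.id)); // → [2, 3, 4, 5, 6]
```

One semantic hit is enough; the surrounding window supplies the "what happened just before and after" context without any further searching.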
## The paradox I didn't expect

Claude-mem's startup context got so good that Claude rarely uses the search tools anymore.

The last 50 observations at session start are usually enough.

But for specific tasks—especially revisiting old work—the timeline feature gives you contextualization-on-demand without burning through your context window on research.

You're paying for focused context, not broad context.

That's the difference.

---

**Repo**: https://github.com/thedotmack/claude-mem

v5.0 just shipped. Python optional but recommended for semantic search. Falls back to keyword search if you don't have it.

Thoughts? Does the "research tax" resonate with anyone else?

@@ -1,103 +0,0 @@
# Your Claude forgets everything after /clear. Mine doesn't.

You know the cycle.

/init to learn your codebase. Takes a few minutes. Claude reads everything, understands your architecture, builds context.

You work for a while. Context window fills up. You try /compact to compress the conversation—but you can't recall specific moments later, and the compressed format is more verbose than useful.

Eventually you hit /clear.

Everything's gone.

Next session: Claude reads CLAUDE.md again. Does the research again. Re-learns your codebase again.

Tokens cost money.

Research takes time.

Low context windows cause quality issues.

Claude forgets.

This cycle is killing productivity.

## Designing instant memory recall that survives /clear

I spent months building persistent memory for Claude Code. Not summaries. Not compressed conversations. Actual persistent memory—capture everything Claude does, process it with AI, make it instantly recallable across sessions.

/clear doesn't delete anything. The memory persists.

Early on I tried all kinds of vector stores, MCPs, memory tools. I was using ChromaDB for vector search.

The documents were massive. Great performance in a RAG sense—semantic matching worked. But they used up context too quickly.

Either I was doing it wrong, or vector databases are just limited in what they can do.

That's how I ended up with the hybrid approach.

## Watching memories get saved live

The entire idea behind "temporal context" came to me as I watched memories being captured in real-time.

I could see that most memories were duplicate knowledge. Your codebase architecture doesn't fundamentally change every session.

But many memories were **changes**. Bugfixes. Refactors. Decisions.

And here's the thing: if you don't have the date and time associated with it, if you don't know it's the "newest" change, then your information is stale.

And if your information is stale, Claude has to go researching.

Researching is the token-heavy work I'm trying to minimize.

## Building v4.0 with timelines in mind

When I was designing claude-mem 4.0 to be a plugin architecture compatible with Claude Code 2.0, I decided to focus on the SQLite database and observation formatting first.

The semantic chunking was architected by design so it could be brought into ChromaDB later for the best possible results.

Then I used the super-fast SQLite index to sort results by date, so you could search for "change" or "bugfix" and see a timeline.

Newest first. So you know what's current.

## Bringing ChromaDB back

Then I brought ChromaDB back to compare with FTS5 searching.

Chroma returned very relevant results with vector relations. FTS5 just doesn't work as well for semantic matching.

And it was fast. Really fast.

That's when the custom timeline feature clicked.

## The "instant replay" idea

My thought was: what if you ask Claude to work on a task from 3 days ago, or 4 weeks ago?

Now you have an "instant replay" of everything that was done around whatever you're searching for.

Natural language search finds the anchor point. Timeline reconstructs the context around that moment. Claude's head is in the game, immediately.

## The paradox

Here's what actually happened.

Claude-mem's startup context got so good that Claude rarely even uses the search tools anymore.

The last 50 observations at session start are usually enough for whatever I'm working on. /clear doesn't reset anything—next session starts exactly where you left off.

But I just built out contextualization-on-demand for v5.0. When you need to recall something specific from weeks ago, the "context timeline" instantly gets Claude's head in the game for that exact task.

No /init. No research phase. No re-learning.

Just: start session, Claude knows your codebase, you work.

Development becomes pleasant instead of repetitive. Token-efficient instead of wasteful. Focused instead of constantly re-explaining.

---

**Repo**: https://github.com/thedotmack/claude-mem

v5.0 just shipped. Python optional but recommended for semantic search. Falls back to keyword search if you don't have it.

Does the "how to work on this task" problem resonate with anyone else?

@@ -1,177 +0,0 @@
# Claude-mem v5.0: I Fixed Vector Search's Time Blindness

Vector databases are amazing at finding similar content. Terrible at knowing *when* that content matters.

I just shipped claude-mem v5.0 with hybrid search—semantic relevance meets temporal context. Sub-200ms queries across 8,200+ vectors.

## The Problem With Pure Vector Search

You search for "authentication bug" in your ChromaDB. It returns:
- That auth refactor from 6 months ago (highly similar!)
- Login flow changes from last year (perfect match!)
- The actual bug you fixed yesterday (similar, but not as close semantically)

All semantically relevant. Chronologically useless.

Vector search finds *what* matches. Doesn't understand *when* it matters.

## v4.x Had the Opposite Problem

SQLite FTS5 keyword search. Fast. Reliable. Token-efficient.

But it only matched exact keywords. "authentication bug" wouldn't find "login validation error" even though they're the same concept.

You had to remember your exact wording from weeks ago. Good luck with that.

## v5.0: Hybrid Search Pipeline

```
Query → Chroma Semantic Search (top 100)
      → 90-day Recency Filter
      → SQLite Temporal Hydration
      → Chronologically Ordered Results
```

**What this means:**

1. **Chroma finds conceptually relevant matches** - "auth bug" matches "login validation error", "session timeout issue", "credential handling problem"

2. **90-day window filters to recent context** - Last 2-3 months of active work, automatically excludes stale results

3. **SQLite provides temporal ordering** - Results flow chronologically, showing how problems evolved and got solved

4. **Timeline reconstruction** - See the session where you hit the bug, the discovery observation, the fix, and what came next

## Example: Natural Language Timeline Search

New tool: `get_timeline_by_query`

**Auto mode** (search → instant timeline):
```
Query: "ChromaDB performance issues"

Found: Observation #3401 (Oct 28, 8:42 PM)
Title: "ChromaSync batch processing optimization"

Timeline (depth_before=10, depth_after=10):
├─ [10 records before] Session context, related observations
├─ [ANCHOR] The performance fix observation
└─ [10 records after] Test results, follow-up changes

Total: 21 records in chronological order
Response: <200ms
```

**Interactive mode** (pick your anchor):
```
Query: "authentication refactor"

Top 5 matches:
#3156 - "JWT token validation overhaul" (Oct 15)
#3089 - "Session middleware refactor" (Oct 12)
#2947 - "OAuth integration changes" (Oct 8)
...

Choose anchor → Get timeline → See full context
```

## Performance: The Numbers

- **1,390 observations** synced to **8,279 vector documents**
- **Semantic search**: <200ms for top 100 matches
- **90-day filter + temporal hydration**: Negligible overhead
- **Total query time**: <200ms end-to-end

This scales. I'm not searching 8K vectors every time—the 90-day window typically narrows to 500-800 recent documents before Chroma even sees them.

## ChromaSync: Automatic Vector Maintenance

New background service that syncs your SQLite data to Chroma:

- **Splits observations** into narrative + facts vectors (better semantic granularity)
- **Splits summaries** into request + learned vectors
- **Indexes user prompts** as single vectors
- **Runs automatically** via PM2 worker service
- **Metadata filtering** by project, type, concepts, files

Example: One observation → Multiple vectors for precise matching.

Your 500-word debugging narrative? Split into semantic chunks. Query matches the relevant section, not just "the whole document is kinda related."

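The one-to-many split can be sketched as follows. The field names, ID scheme, and sample data here are assumptions for illustration only, not ChromaSync's actual schema:

```typescript
// Sketch: split one observation into separate narrative and facts vector docs,
// each carrying metadata for filtering. All names are illustrative.
interface Observation {
  id: number;
  narrative: string;
  facts: string[];
  project: string;
}

interface VectorDoc {
  id: string;
  text: string;
  metadata: { sqlite_id: number; doc_type: string; project: string };
}

function toVectorDocs(obs: Observation): VectorDoc[] {
  const base = { sqlite_id: obs.id, project: obs.project };
  return [
    // The narrative becomes one vector document...
    { id: `obs_${obs.id}_narrative`, text: obs.narrative, metadata: { ...base, doc_type: "observation" } },
    // ...and each fact becomes its own, so a query can match just the relevant chunk.
    ...obs.facts.map((fact, i) => ({
      id: `obs_${obs.id}_fact_${i}`,
      text: fact,
      metadata: { ...base, doc_type: "observation" },
    })),
  ];
}

const docs = toVectorDocs({
  id: 3401,
  narrative: "Optimized ChromaSync batch processing to cut sync time.",
  facts: ["batch size raised to 100", "duplicate IDs skipped before insert"],
  project: "claude-mem",
});
console.log(docs.map((d) => d.id)); // → ["obs_3401_narrative", "obs_3401_fact_0", "obs_3401_fact_1"]
```

Because every chunk keeps the parent's `sqlite_id` in metadata, a match on any chunk can be hydrated back to the full record.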
## Graceful Fallback

No Python? No problem.

System detects missing Chroma and falls back to FTS5 keyword search. Same API, same tools, slightly less magical semantic matching.

You lose semantic understanding but keep full functionality. All 9 MCP search tools still work.

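The fallback shape is a simple prefer-semantic-else-keyword wrapper. The function names below are hypothetical; the real detection checks for a working Python/Chroma install rather than a nullable function:

```typescript
// Sketch of graceful degradation: prefer semantic search, fall back to keyword
// search when the vector backend is unavailable or fails. Names are illustrative.
type SearchFn = (query: string) => string[];

function makeSearch(semantic: SearchFn | null, keyword: SearchFn): SearchFn {
  return (query) => {
    if (semantic) {
      try {
        return semantic(query); // vector backend available: semantic results
      } catch {
        // backend errored at query time: fall through to keyword search
      }
    }
    return keyword(query); // FTS5-style keyword fallback
  };
}

const keyword: SearchFn = (q) => [`keyword match for "${q}"`];
const search = makeSearch(null, keyword); // simulate: no Python/Chroma installed
console.log(search("auth bug")); // → ['keyword match for "auth bug"']
```

The caller-facing API never changes; only the quality of matching degrades, which is what keeps all the search tools working without Python.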
## All 9 Search Tools Now Hybrid

Every search method got the upgrade:

1. **search_observations** - Hybrid semantic + keyword across observations
2. **search_sessions** - Hybrid across session summaries
3. **search_user_prompts** - Hybrid across raw user input
4. **find_by_concept** - Filter by tags + semantic similarity
5. **find_by_file** - File references + semantic context
6. **find_by_type** - Type filter + semantic relevance
7. **get_recent_context** - Temporal only (no search needed)
8. **get_context_timeline** - Timeline around anchor point
9. **get_timeline_by_query** - Natural language timeline search

## Why This Matters

**Before v5.0:**
- "Show me auth bugs" → Exact keyword match only
- Miss semantically similar issues with different wording
- No temporal context about when/how issues evolved

**After v5.0:**
- "Show me auth bugs" → Finds authentication, login, session, credential issues
- Filtered to last 90 days automatically
- Results in chronological order showing problem evolution
- Timeline reconstruction shows full context

Claude doesn't just find relevant information. Claude sees *when* it happened and what came next.

## Migration

Zero breaking changes. Your existing SQLite data continues working.

**Optional upgrade** for semantic search:
```bash
# Install Chroma MCP server (requires Python 3.8+)
# Instructions in repo README

# That's it. ChromaSync detects Chroma and syncs automatically.
```

First sync takes ~30 seconds for 1,400 observations. After that, incremental syncs are near-instant.

## The Paradox Continues

v5.0's hybrid search is so good that Claude *still* rarely needs to search.

The context-hook's 50-observation startup context usually has everything. But when Claude needs something from 6 weeks ago? Semantic search + timeline reconstruction gets it instantly.

No keyword guessing. No re-reading code. Just: ask in natural language, get chronological context, keep coding.

## Install

```bash
# In Claude Code:
/plugin marketplace add thedotmack/claude-mem
/plugin install claude-mem

# Optional: Install Python + Chroma for semantic search
# Falls back to keyword search if you don't
```

**Repo:** https://github.com/thedotmack/claude-mem

claude-mem v5.0 combines the semantic magic of vector search with the temporal clarity of chronological ordering.

Finally: relevance *and* context. In under 200ms.

Anyone else built hybrid search systems? How did you handle the time dimension?

@@ -1,314 +0,0 @@
# CodeRabbit Review - Issue Validation

**Analysis Date:** 2025-11-03
**Analyzed By:** Claude (Sonnet 4.5)
**Priority:** 🔴 Critical | 🟡 Medium | 🟢 Low

---

## Issue 1: Chroma Search False Positives

**Location:** `experiment/chroma-search-test.ts:135-166`
**Priority:** 🟢 Low
**Status:** ✅ CONFIRMED - Real bug, correct fix
**Severity:** Low (experiment file only, not production code)

### Problem
The code marks `chromaFound = true` if the raw text contains the string `'ids'`, even for empty results like `'ids': [[]]`.

**Current code (line 137):**
```typescript
testResult.chromaFound = resultText.includes('ids') && resultText.length > 50;
```

This creates false positives by checking for string containment rather than validating actual result content.

### Validation
Confirmed by reading the actual code. The logic uses simple string matching which would match both:
- Real results: `'ids': [['obs_123', 'obs_456']]` ✓
- Empty results: `'ids': [[]]` ✗ (incorrectly marked as success)

### Recommended Fix
Validate that the `ids` array actually contains at least one ID. A non-greedy parse of the array contents is easy to get wrong (a pattern like `\[(.*?)\]` stops at the first `]` and can still report `'ids': [[]]` as non-empty), so check for a quoted ID immediately after the opening brackets instead:

```typescript
// Real results look like 'ids': [['obs_123', ...]]; empty results are 'ids': [[]].
// A non-empty inner array always starts with a quoted ID, so require a quote
// immediately after the opening brackets.
testResult.chromaFound = /'ids':\s*\[\s*\[\s*['"]/.test(resultText);
```

### Decision
**DEFER** - This is an experiment file, not production code. The bug doesn't affect actual functionality. Can be fixed as a cleanup task when working in this area.

---

## Issue 2: 90-Day Cutoff Units Mismatch

**Location:** `src/servers/search-server.ts:374-381` (and 3 other hybrid search handlers)
**Priority:** 🔴 Critical
**Status:** ✅ CONFIRMED - Critical bug, MUST FIX IMMEDIATELY
**Severity:** High (breaks 90-day temporal filtering entirely)

### Problem
The 90-day cutoff is computed in **seconds** but `created_at_epoch` is stored in **milliseconds**, causing the filter to never exclude anything.

**Current code (line 374):**
```typescript
const ninetyDaysAgo = Math.floor(Date.now() / 1000) - (90 * 24 * 60 * 60);
// ...
return meta && meta.created_at_epoch > ninetyDaysAgo;
```

### Validation
**Database verification:**
```bash
$ sqlite3 ~/.claude-mem/claude-mem.db "SELECT created_at_epoch FROM observations LIMIT 1"
1762212399087 # This is in MILLISECONDS
```

**Comparison breakdown:**
- `ninetyDaysAgo` = ~1,754,000,000 (seconds, 10 digits)
- `created_at_epoch` = ~1,762,212,399,087 (milliseconds, 13 digits)

The millisecond value is **ALWAYS** larger than the second value, so the filter `created_at_epoch > ninetyDaysAgo` **ALWAYS** passes, accepting ALL documents regardless of age.

### Impact
- 90-day temporal boundary completely non-functional
- Performance degradation (processes all historical data)
- Incorrect search results (includes very old observations)
- Affects 4 handlers: `search_observations`, `search_sessions`, `search_user_prompts`, `get_timeline_by_query`

### Recommended Fix
Keep milliseconds throughout (remove the `/1000` division):

**File:** `src/servers/search-server.ts`

**Find and replace in all 4 hybrid search handlers:**
```typescript
// OLD (WRONG - converts to seconds)
const ninetyDaysAgo = Math.floor(Date.now() / 1000) - (90 * 24 * 60 * 60);

// NEW (CORRECT - stays in milliseconds)
const ninetyDaysAgo = Date.now() - (90 * 24 * 60 * 60 * 1000);
```

**Locations to fix:**
1. `search_observations` handler (~line 374)
2. `search_sessions` handler
3. `search_user_prompts` handler
4. `get_timeline_by_query` handler

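The failure mode is easy to demonstrate in isolation. The snippet below mirrors the two cutoff computations against a simulated year-old millisecond timestamp (the record value is illustrative):

```typescript
// Standalone demonstration of the unit mismatch.
const NINETY_DAYS_MS = 90 * 24 * 60 * 60 * 1000;

// Buggy cutoff: seconds. Corrected cutoff: milliseconds.
const cutoffSeconds = Math.floor(Date.now() / 1000) - 90 * 24 * 60 * 60;
const cutoffMillis = Date.now() - NINETY_DAYS_MS;

// created_at_epoch values are stored in milliseconds; simulate a year-old record.
const yearOldEpochMs = Date.now() - 365 * 24 * 60 * 60 * 1000;

// The year-old record wrongly passes the seconds-based filter (13-digit ms value
// is always larger than a 10-digit seconds value)...
console.log(yearOldEpochMs > cutoffSeconds); // → true (bug: nothing is ever filtered)
// ...but is correctly excluded by the milliseconds-based filter.
console.log(yearOldEpochMs > cutoffMillis); // → false (fixed)
```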
### Decision
**FIX IMMEDIATELY** - This is a critical bug that breaks core functionality.

---

## Issue 3: Chroma Collection Name Mismatch

**Location:** `src/services/sync/ChromaSync.ts:77-81` and `src/servers/search-server.ts:26`
**Priority:** 🟡 Medium
**Status:** ⚠️ CURRENTLY WORKS but architectural risk
**Severity:** Medium (maintainability issue, potential future breakage)

### Problem
ChromaSync builds collection names as `cm__${project}` (parameterized) while search-server uses a hard-coded `'cm__claude-mem'`, creating maintainability risk.

**ChromaSync.ts (line 79):**
```typescript
this.collectionName = `cm__${project}`;
```

**search-server.ts (line 26):**
```typescript
const COLLECTION_NAME = 'cm__claude-mem';
```

**worker-service.ts (line 94):**
```typescript
this.chromaSync = new ChromaSync('claude-mem');
```

### Validation
**Current state:** WORKS (both resolve to `'cm__claude-mem'`)
**Risk:** If anyone changes the ChromaSync instantiation parameter or creates another instance, collections won't match.

### Recommended Fix
Create a shared constant in a common config location:

**New file:** `src/shared/config.ts`
```typescript
export const CHROMA_COLLECTION_NAME = 'cm__claude-mem';
// OR for dynamic project support:
export function getCollectionName(project: string = 'claude-mem'): string {
  return `cm__${project}`;
}
```

**Update ChromaSync.ts:**
```typescript
import { CHROMA_COLLECTION_NAME } from '../shared/config';
// ...
this.collectionName = CHROMA_COLLECTION_NAME;
```

**Update search-server.ts:**
```typescript
import { CHROMA_COLLECTION_NAME } from '../shared/config';
// ...
const COLLECTION_NAME = CHROMA_COLLECTION_NAME;
```

### Decision
**RECOMMENDED FIX** - Good architectural improvement, prevents future bugs. Not urgent since it currently works, but should be included in the next refactoring pass.

---

## Issue 4: doc_type Value Mismatch in ChromaSync

**Location:** `src/services/sync/ChromaSync.ts:523-532` (read) vs lines 240, 429 (write)
**Priority:** 🔴 Critical
**Status:** ✅ CONFIRMED - Critical bug, MUST FIX
**Severity:** High (breaks deduplication, causes duplicate insert failures)

### Problem
Documents are written with `'session_summary'` and `'user_prompt'` but the deduplication logic looks for `'summary'` and `'prompt'`, causing existing documents to not be detected.

**Write side (formatSummaryDocs, line 240):**
```typescript
doc_type: 'session_summary',
```

**Write side (formatUserPromptDoc, line 429):**
```typescript
doc_type: 'user_prompt',
```

**Read side (getExistingChromaIds, lines 526-529):**
```typescript
} else if (meta.doc_type === 'summary') {
  summaryIds.add(meta.sqlite_id);
} else if (meta.doc_type === 'prompt') {
  promptIds.add(meta.sqlite_id);
}
```

### Validation
Confirmed by code inspection. The mismatch causes:
1. `getExistingChromaIds` doesn't find existing summaries/prompts
2. They're not added to the deduplication sets
3. System tries to insert them again
4. Chroma rejects with duplicate ID errors

### Impact
- Deduplication completely broken for summaries and prompts
- Backfill operations fail (see Issue 5)
- Duplicate insert errors in production
- Observations work fine (they use 'observation' consistently)

### Recommended Fix
**PREFERRED APPROACH:** Fix the read side (backward compatible with existing Chroma data)

**File:** `src/services/sync/ChromaSync.ts`
**Lines:** 526-529

```typescript
} else if (meta.doc_type === 'session_summary') { // Changed from 'summary'
  summaryIds.add(meta.sqlite_id);
} else if (meta.doc_type === 'user_prompt') { // Changed from 'prompt'
  promptIds.add(meta.sqlite_id);
}
```

**Why this approach:**
- ✅ Backward compatible with existing Chroma data
- ✅ No data migration required
- ✅ Safer than changing write side
- ✅ Works immediately

**Alternative approach (NOT recommended):** Change write side to use 'summary'/'prompt'
- ❌ Requires Chroma data migration
- ❌ Orphans existing documents
- ❌ Higher risk

### Decision
**FIX IMMEDIATELY** - Critical bug affecting deduplication. Use the backward-compatible fix (change read side).

---
|
||||
|
||||
## Issue 5: doc_type Mismatch Causing Backfill Failures

**Location:** `src/services/worker-service.ts:120-128` (manifestation of Issue 4)
**Priority:** 🔴 Critical
**Status:** ✅ CONFIRMED - Same root cause as Issue 4
**Severity:** High (duplicate of Issue 4)

### Problem

Backfill operations fail because of the doc_type mismatch described in Issue 4.

### Validation

This is not a separate bug - it's a **symptom** of Issue 4. The backfill process:

1. Queries SQLite for summaries/prompts to sync
2. Calls `getExistingChromaIds` to avoid duplicates
3. Due to the doc_type mismatch, existing IDs aren't found
4. Tries to insert documents that already exist
5. Chroma rejects the inserts with duplicate ID errors
6. Backfill fails
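
That sequence can be reproduced in miniature: when the "already synced" set comes back empty, the dedup guard lets every row through and the store rejects it. The names here are illustrative, not the actual worker-service code:

```typescript
// Simulated Chroma-side store that rejects duplicate IDs on insert.
const chromaIds = new Set<string>(['summary-1']);

function insertOrThrow(id: string): void {
  if (chromaIds.has(id)) throw new Error(`duplicate ID: ${id}`);
  chromaIds.add(id);
}

// Backfill: skip rows whose IDs we believe already exist in Chroma.
function backfill(rows: string[], existing: Set<string>): string[] {
  const inserted: string[] = [];
  for (const id of rows) {
    if (existing.has(id)) continue; // dedup guard
    insertOrThrow(id);
    inserted.push(id);
  }
  return inserted;
}

// With the doc_type mismatch, `existing` is empty and the insert blows up.
try {
  backfill(['summary-1'], new Set());
} catch (e) {
  console.log((e as Error).message); // duplicate ID: summary-1
}

// With the read side fixed, the same row is skipped cleanly.
console.log(backfill(['summary-1'], new Set(['summary-1']))); // []
```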

### Decision

**AUTOMATICALLY RESOLVED** by fixing Issue 4. No separate fix is needed.

---

## Summary & Action Plan

### Critical Issues (Fix Immediately)

1. ✅ **Issue 2** - 90-day units mismatch
   - Fix: Change all 4 handlers to use milliseconds
   - Impact: Restores temporal filtering functionality

2. ✅ **Issue 4** - doc_type mismatch
   - Fix: Change `getExistingChromaIds` to use 'session_summary'/'user_prompt'
   - Impact: Fixes deduplication and backfill

3. ✅ **Issue 5** - Automatically resolved by fixing Issue 4
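
For Issue 2, the failure is a plain unit mismatch: a cutoff computed in seconds compared against epoch values stored in milliseconds. A minimal sketch of the fix, assuming millisecond epochs as the recommendation states (variable names are illustrative):

```typescript
const NINETY_DAYS_SECONDS = 90 * 24 * 60 * 60;
const NINETY_DAYS_MS = NINETY_DAYS_SECONDS * 1000;

const nowMs = Date.now();
const recordEpochMs = nowMs - 10 * 24 * 60 * 60 * 1000; // a record from 10 days ago

// Buggy: cutoff in seconds. Any millisecond epoch is ~1000x larger, so the
// comparison no longer expresses "within the last 90 days".
const cutoffSeconds = Math.floor(nowMs / 1000) - NINETY_DAYS_SECONDS;

// Fixed: both sides in milliseconds.
const cutoffMs = nowMs - NINETY_DAYS_MS;

console.log(recordEpochMs >= cutoffMs);      // true: correctly inside the window
console.log(recordEpochMs >= cutoffSeconds); // true, but vacuously so (ms vs s)
// A 100-day-old record also slips past the seconds cutoff:
console.log(nowMs - 100 * 24 * 60 * 60 * 1000 >= cutoffSeconds); // true
```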

### Medium Priority (Include in Next Refactor)

4. ⚠️ **Issue 3** - Collection name consistency
   - Fix: Create a shared constant
   - Impact: Better maintainability, prevents future bugs

### Low Priority (Defer)

5. 🟢 **Issue 1** - False positives in experiment
   - Fix: Parse and validate arrays
   - Impact: More accurate test results (experiment only)

### Files Requiring Changes

**High Priority:**
- `src/servers/search-server.ts` (Issue 2 - 4 locations)
- `src/services/sync/ChromaSync.ts` (Issue 4 - lines 526-529)

**Medium Priority:**
- `src/shared/config.ts` (Issue 3 - new file)
- `src/services/sync/ChromaSync.ts` (Issue 3 - import)
- `src/servers/search-server.ts` (Issue 3 - import)

**Low Priority:**
- `experiment/chroma-search-test.ts` (Issue 1)

### Testing Recommendations

After the fixes:

1. Test 90-day filtering with dates before and after the cutoff
2. Run a backfill operation to verify deduplication
3. Verify there are no duplicate ID errors in the logs
4. Test hybrid search with temporal boundaries
+2
-35
@@ -9,44 +9,11 @@
 * pm2 status
 */

const os = require('os');
const path = require('path');

// Determine log directory
const logDir = path.join(os.homedir(), '.claude-mem', 'logs');

module.exports = {
  apps: [{
    name: 'claude-mem-worker',
    script: './plugin/scripts/worker-service.cjs',
    interpreter: 'node',
    instances: 1,
    exec_mode: 'fork',
    autorestart: true,
    watch: false,

    env: {
      NODE_ENV: 'production',
      CLAUDE_MEM_WORKER_PORT: 37777, // Fixed port for reliability
      FORCE_COLOR: '1'
    },

    // Logging
    error_file: path.join(logDir, 'worker-error.log'),
    out_file: path.join(logDir, 'worker-out.log'),
    log_date_format: 'YYYY-MM-DD HH:mm:ss.SSS Z',
    merge_logs: true,

    // Keep logs from last 7 days
    log_type: 'json',

    // Process management
    kill_timeout: 1000,
    listen_timeout: 3000,
    shutdown_with_message: true,

    // PM2 Plus (optional monitoring)
    // instance_var: 'INSTANCE_ID',
    // pmx: true
    error_file: '/dev/null',
    out_file: '/dev/null'
  }]
};
@@ -1,7 +1,7 @@
#!/usr/bin/env node
import{stdin as O}from"process";import W from"better-sqlite3";import{join as _,dirname as M,basename as K}from"path";import{homedir as L}from"os";import{existsSync as Q,mkdirSync as X}from"fs";import{fileURLToPath as F}from"url";function B(){return typeof __dirname<"u"?__dirname:M(F(import.meta.url))}var P=B(),u=process.env.CLAUDE_MEM_DATA_DIR||_(L(),".claude-mem"),R=process.env.CLAUDE_CONFIG_DIR||_(L(),".claude"),Z=_(u,"archives"),ee=_(u,"logs"),se=_(u,"trash"),te=_(u,"backups"),re=_(u,"settings.json"),A=_(u,"claude-mem.db"),ne=_(u,"vector-db"),oe=_(R,"settings.json"),ie=_(R,"commands"),ae=_(R,"CLAUDE.md");function C(c){X(c,{recursive:!0})}function v(){return _(P,"..","..")}var h=(n=>(n[n.DEBUG=0]="DEBUG",n[n.INFO=1]="INFO",n[n.WARN=2]="WARN",n[n.ERROR=3]="ERROR",n[n.SILENT=4]="SILENT",n))(h||{}),N=class{level;useColor;constructor(){let e=process.env.CLAUDE_MEM_LOG_LEVEL?.toUpperCase()||"INFO";this.level=h[e]??1,this.useColor=process.stdout.isTTY??!1}correlationId(e,s){return`obs-${e}-${s}`}sessionId(e){return`session-${e}`}formatData(e){if(e==null)return"";if(typeof e=="string")return e;if(typeof e=="number"||typeof e=="boolean")return e.toString();if(typeof e=="object"){if(e instanceof Error)return this.level===0?`${e.message}
${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Object.keys(e);return s.length===0?"{}":s.length<=3?JSON.stringify(e):`{${s.length} keys: ${s.slice(0,3).join(", ")}...}`}return String(e)}formatTool(e,s){if(!s)return e;try{let t=typeof s=="string"?JSON.parse(s):s;if(e==="Bash"&&t.command){let r=t.command.length>50?t.command.substring(0,50)+"...":t.command;return`${e}(${r})`}if(e==="Read"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Edit"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Write"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}return e}catch{return e}}log(e,s,t,r,n){if(e<this.level)return;let o=new Date().toISOString().replace("T"," ").substring(0,23),i=h[e].padEnd(5),d=s.padEnd(6),E="";r?.correlationId?E=`[${r.correlationId}] `:r?.sessionId&&(E=`[session-${r.sessionId}] `);let m="";n!=null&&(this.level===0&&typeof n=="object"?m=`
`+JSON.stringify(n,null,2):m=" "+this.formatData(n));let T="";if(r){let{sessionId:l,sdkSessionId:S,correlationId:p,...a}=r;Object.keys(a).length>0&&(T=` {${Object.entries(a).map(([U,w])=>`${U}=${w}`).join(", ")}}`)}let b=`[${o}] [${i}] [${d}] ${E}${t}${T}${m}`;e===3?console.error(b):console.log(b)}debug(e,s,t,r){this.log(0,e,s,t,r)}info(e,s,t,r){this.log(1,e,s,t,r)}warn(e,s,t,r){this.log(2,e,s,t,r)}error(e,s,t,r){this.log(3,e,s,t,r)}dataIn(e,s,t,r){this.info(e,`\u2192 ${s}`,t,r)}dataOut(e,s,t,r){this.info(e,`\u2190 ${s}`,t,r)}success(e,s,t,r){this.info(e,`\u2713 ${s}`,t,r)}failure(e,s,t,r){this.error(e,`\u2717 ${s}`,t,r)}timing(e,s,t,r){this.info(e,`\u23F1 ${s}`,r,{duration:`${t}ms`})}},y=new N;var g=class{db;constructor(){C(u),this.db=new W(A),this.db.pragma("journal_mode = WAL"),this.db.pragma("synchronous = NORMAL"),this.db.pragma("foreign_keys = ON"),this.initializeSchema(),this.ensureWorkerPortColumn(),this.ensurePromptTrackingColumns(),this.removeSessionSummariesUniqueConstraint(),this.addObservationHierarchicalFields(),this.makeObservationsTextNullable(),this.createUserPromptsTable()}initializeSchema(){try{this.db.exec(`
import{stdin as I}from"process";import M from"better-sqlite3";import{join as E,dirname as y,basename as F}from"path";import{homedir as O}from"os";import{existsSync as $,mkdirSync as k}from"fs";import{fileURLToPath as x}from"url";function U(){return typeof __dirname<"u"?__dirname:y(x(import.meta.url))}var P=U(),u=process.env.CLAUDE_MEM_DATA_DIR||E(O(),".claude-mem"),R=process.env.CLAUDE_CONFIG_DIR||E(O(),".claude"),W=E(u,"archives"),Y=E(u,"logs"),K=E(u,"trash"),V=E(u,"backups"),q=E(u,"settings.json"),f=E(u,"claude-mem.db"),J=E(u,"vector-db"),Q=E(R,"settings.json"),z=E(R,"commands"),Z=E(R,"CLAUDE.md");function L(p){k(p,{recursive:!0})}var N=(n=>(n[n.DEBUG=0]="DEBUG",n[n.INFO=1]="INFO",n[n.WARN=2]="WARN",n[n.ERROR=3]="ERROR",n[n.SILENT=4]="SILENT",n))(N||{}),h=class{level;useColor;constructor(){let e=process.env.CLAUDE_MEM_LOG_LEVEL?.toUpperCase()||"INFO";this.level=N[e]??1,this.useColor=process.stdout.isTTY??!1}correlationId(e,s){return`obs-${e}-${s}`}sessionId(e){return`session-${e}`}formatData(e){if(e==null)return"";if(typeof e=="string")return e;if(typeof e=="number"||typeof e=="boolean")return e.toString();if(typeof e=="object"){if(e instanceof Error)return this.level===0?`${e.message}
${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Object.keys(e);return s.length===0?"{}":s.length<=3?JSON.stringify(e):`{${s.length} keys: ${s.slice(0,3).join(", ")}...}`}return String(e)}formatTool(e,s){if(!s)return e;try{let t=typeof s=="string"?JSON.parse(s):s;if(e==="Bash"&&t.command){let r=t.command.length>50?t.command.substring(0,50)+"...":t.command;return`${e}(${r})`}if(e==="Read"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Edit"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Write"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}return e}catch{return e}}log(e,s,t,r,n){if(e<this.level)return;let o=new Date().toISOString().replace("T"," ").substring(0,23),i=N[e].padEnd(5),d=s.padEnd(6),_="";r?.correlationId?_=`[${r.correlationId}] `:r?.sessionId&&(_=`[session-${r.sessionId}] `);let m="";n!=null&&(this.level===0&&typeof n=="object"?m=`
`+JSON.stringify(n,null,2):m=" "+this.formatData(n));let T="";if(r){let{sessionId:l,sdkSessionId:S,correlationId:c,...a}=r;Object.keys(a).length>0&&(T=` {${Object.entries(a).map(([v,D])=>`${v}=${D}`).join(", ")}}`)}let b=`[${o}] [${i}] [${d}] ${_}${t}${T}${m}`;e===3?console.error(b):console.log(b)}debug(e,s,t,r){this.log(0,e,s,t,r)}info(e,s,t,r){this.log(1,e,s,t,r)}warn(e,s,t,r){this.log(2,e,s,t,r)}error(e,s,t,r){this.log(3,e,s,t,r)}dataIn(e,s,t,r){this.info(e,`\u2192 ${s}`,t,r)}dataOut(e,s,t,r){this.info(e,`\u2190 ${s}`,t,r)}success(e,s,t,r){this.info(e,`\u2713 ${s}`,t,r)}failure(e,s,t,r){this.error(e,`\u2717 ${s}`,t,r)}timing(e,s,t,r){this.info(e,`\u23F1 ${s}`,r,{duration:`${t}ms`})}},A=new h;var g=class{db;constructor(){L(u),this.db=new M(f),this.db.pragma("journal_mode = WAL"),this.db.pragma("synchronous = NORMAL"),this.db.pragma("foreign_keys = ON"),this.initializeSchema(),this.ensureWorkerPortColumn(),this.ensurePromptTrackingColumns(),this.removeSessionSummariesUniqueConstraint(),this.addObservationHierarchicalFields(),this.makeObservationsTextNullable(),this.createUserPromptsTable()}initializeSchema(){try{this.db.exec(`
CREATE TABLE IF NOT EXISTS schema_versions (
id INTEGER PRIMARY KEY,
version INTEGER UNIQUE NOT NULL,
@@ -269,7 +269,7 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
UPDATE sdk_sessions
SET sdk_session_id = ?
WHERE id = ? AND sdk_session_id IS NULL
`).run(s,e).changes===0?(y.debug("DB","sdk_session_id already set, skipping update",{sessionId:e,sdkSessionId:s}),!1):!0}setWorkerPort(e,s){this.db.prepare(`
`).run(s,e).changes===0?(A.debug("DB","sdk_session_id already set, skipping update",{sessionId:e,sdkSessionId:s}),!1):!0}setWorkerPort(e,s){this.db.prepare(`
UPDATE sdk_sessions
SET worker_port = ?
WHERE id = ?
@@ -331,7 +331,7 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
WHERE up.id IN (${i})
ORDER BY up.created_at_epoch ${n}
${o}
`).all(...e)}getTimelineAroundTimestamp(e,s=10,t=10,r){return this.getTimelineAroundObservation(null,e,s,t,r)}getTimelineAroundObservation(e,s,t=10,r=10,n){let o=n?"AND project = ?":"",i=n?[n]:[],d,E;if(e!==null){let l=`
`).all(...e)}getTimelineAroundTimestamp(e,s=10,t=10,r){return this.getTimelineAroundObservation(null,e,s,t,r)}getTimelineAroundObservation(e,s,t=10,r=10,n){let o=n?"AND project = ?":"",i=n?[n]:[],d,_;if(e!==null){let l=`
SELECT id, created_at_epoch
FROM observations
WHERE id <= ? ${o}
@@ -343,7 +343,7 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
WHERE id >= ? ${o}
ORDER BY id ASC
LIMIT ?
`;try{let p=this.db.prepare(l).all(e,...i,t+1),a=this.db.prepare(S).all(e,...i,r+1);if(p.length===0&&a.length===0)return{observations:[],sessions:[],prompts:[]};d=p.length>0?p[p.length-1].created_at_epoch:s,E=a.length>0?a[a.length-1].created_at_epoch:s}catch(p){return console.error("[SessionStore] Error getting boundary observations:",p.message),{observations:[],sessions:[],prompts:[]}}}else{let l=`
`;try{let c=this.db.prepare(l).all(e,...i,t+1),a=this.db.prepare(S).all(e,...i,r+1);if(c.length===0&&a.length===0)return{observations:[],sessions:[],prompts:[]};d=c.length>0?c[c.length-1].created_at_epoch:s,_=a.length>0?a[a.length-1].created_at_epoch:s}catch(c){return console.error("[SessionStore] Error getting boundary observations:",c.message),{observations:[],sessions:[],prompts:[]}}}else{let l=`
SELECT created_at_epoch
FROM observations
WHERE created_at_epoch <= ? ${o}
@@ -355,7 +355,7 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
WHERE created_at_epoch >= ? ${o}
ORDER BY created_at_epoch ASC
LIMIT ?
`;try{let p=this.db.prepare(l).all(s,...i,t),a=this.db.prepare(S).all(s,...i,r+1);if(p.length===0&&a.length===0)return{observations:[],sessions:[],prompts:[]};d=p.length>0?p[p.length-1].created_at_epoch:s,E=a.length>0?a[a.length-1].created_at_epoch:s}catch(p){return console.error("[SessionStore] Error getting boundary timestamps:",p.message),{observations:[],sessions:[],prompts:[]}}}let m=`
`;try{let c=this.db.prepare(l).all(s,...i,t),a=this.db.prepare(S).all(s,...i,r+1);if(c.length===0&&a.length===0)return{observations:[],sessions:[],prompts:[]};d=c.length>0?c[c.length-1].created_at_epoch:s,_=a.length>0?a[a.length-1].created_at_epoch:s}catch(c){return console.error("[SessionStore] Error getting boundary timestamps:",c.message),{observations:[],sessions:[],prompts:[]}}}let m=`
SELECT *
FROM observations
WHERE created_at_epoch >= ? AND created_at_epoch <= ? ${o}
@@ -371,5 +371,5 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
JOIN sdk_sessions s ON up.claude_session_id = s.claude_session_id
WHERE up.created_at_epoch >= ? AND up.created_at_epoch <= ? ${o.replace("project","s.project")}
ORDER BY up.created_at_epoch ASC
`;try{let l=this.db.prepare(m).all(d,E,...i),S=this.db.prepare(T).all(d,E,...i),p=this.db.prepare(b).all(d,E,...i);return{observations:l,sessions:S.map(a=>({id:a.id,sdk_session_id:a.sdk_session_id,project:a.project,request:a.request,completed:a.completed,next_steps:a.next_steps,created_at:a.created_at,created_at_epoch:a.created_at_epoch})),prompts:p.map(a=>({id:a.id,claude_session_id:a.claude_session_id,project:a.project,prompt:a.prompt_text,created_at:a.created_at,created_at_epoch:a.created_at_epoch}))}}catch(l){return console.error("[SessionStore] Error querying timeline records:",l.message),{observations:[],sessions:[],prompts:[]}}}close(){this.db.close()}};import f from"path";import{existsSync as I}from"fs";import{spawn as H}from"child_process";var $=parseInt(process.env.CLAUDE_MEM_WORKER_PORT||"37777",10),G=`http://127.0.0.1:${$}/health`;async function D(){try{return(await fetch(G,{signal:AbortSignal.timeout(500)})).ok}catch{return!1}}async function k(){try{if(await D())return!0;console.error("[claude-mem] Worker not responding, starting...");let c=v(),e=f.join(c,"plugin","scripts","worker-service.cjs");if(!I(e))return console.error(`[claude-mem] Worker service not found at ${e}`),!1;let s=f.join(c,"ecosystem.config.cjs"),t=f.join(c,"node_modules",".bin","pm2");if(!I(t))throw new Error(`PM2 binary not found at ${t}. This is a bundled dependency - try running: npm install`);if(!I(s))throw new Error(`PM2 ecosystem config not found at ${s}. 
Plugin installation may be corrupted.`);let r=H(t,["start",s],{detached:!0,stdio:"ignore",cwd:c});r.on("error",n=>{throw new Error(`Failed to spawn PM2: ${n.message}`)}),r.unref(),console.error("[claude-mem] Worker started with PM2");for(let n=0;n<3;n++)if(await new Promise(o=>setTimeout(o,500)),await D())return console.error("[claude-mem] Worker is healthy"),!0;return console.error("[claude-mem] Worker failed to become healthy after startup"),!1}catch(c){return console.error(`[claude-mem] Failed to start worker: ${c.message}`),!1}}async function x(c){console.error("[claude-mem cleanup] Hook fired",{input:c?{session_id:c.session_id,cwd:c.cwd,reason:c.reason}:null}),c||(console.log("No input provided - this script is designed to run as a Claude Code SessionEnd hook"),console.log(`
Expected input format:`),console.log(JSON.stringify({session_id:"string",cwd:"string",transcript_path:"string",hook_event_name:"SessionEnd",reason:"exit"},null,2)),process.exit(0));let{session_id:e,reason:s}=c;console.error("[claude-mem cleanup] Searching for active SDK session",{session_id:e,reason:s}),await k()||console.error("[claude-mem cleanup] Worker not available - skipping HTTP cleanup");let r=new g,n=r.findActiveSDKSession(e);n||(console.error("[claude-mem cleanup] No active SDK session found",{session_id:e}),r.close(),console.log('{"continue": true, "suppressOutput": true}'),process.exit(0)),console.error("[claude-mem cleanup] Active SDK session found",{session_id:n.id,sdk_session_id:n.sdk_session_id,project:n.project,worker_port:n.worker_port}),r.markSessionCompleted(n.id),console.error("[claude-mem cleanup] Session marked as completed in database"),r.close(),console.error("[claude-mem cleanup] Cleanup completed successfully"),console.log('{"continue": true, "suppressOutput": true}'),process.exit(0)}if(O.isTTY)x(void 0);else{let c="";O.on("data",e=>c+=e),O.on("end",async()=>{let e=c?JSON.parse(c):void 0;await x(e)})}
`;try{let l=this.db.prepare(m).all(d,_,...i),S=this.db.prepare(T).all(d,_,...i),c=this.db.prepare(b).all(d,_,...i);return{observations:l,sessions:S.map(a=>({id:a.id,sdk_session_id:a.sdk_session_id,project:a.project,request:a.request,completed:a.completed,next_steps:a.next_steps,created_at:a.created_at,created_at_epoch:a.created_at_epoch})),prompts:c.map(a=>({id:a.id,claude_session_id:a.claude_session_id,project:a.project,prompt:a.prompt_text,created_at:a.created_at,created_at_epoch:a.created_at_epoch}))}}catch(l){return console.error("[SessionStore] Error querying timeline records:",l.message),{observations:[],sessions:[],prompts:[]}}}close(){this.db.close()}};async function C(p){console.error("[claude-mem cleanup] Hook fired",{input:p?{session_id:p.session_id,cwd:p.cwd,reason:p.reason}:null}),p||(console.log("No input provided - this script is designed to run as a Claude Code SessionEnd hook"),console.log(`
Expected input format:`),console.log(JSON.stringify({session_id:"string",cwd:"string",transcript_path:"string",hook_event_name:"SessionEnd",reason:"exit"},null,2)),process.exit(0));let{session_id:e,reason:s}=p;console.error("[claude-mem cleanup] Searching for active SDK session",{session_id:e,reason:s});let t=new g,r=t.findActiveSDKSession(e);r||(console.error("[claude-mem cleanup] No active SDK session found",{session_id:e}),t.close(),console.log('{"continue": true, "suppressOutput": true}'),process.exit(0)),console.error("[claude-mem cleanup] Active SDK session found",{session_id:r.id,sdk_session_id:r.sdk_session_id,project:r.project,worker_port:r.worker_port}),t.markSessionCompleted(r.id),console.error("[claude-mem cleanup] Session marked as completed in database"),t.close(),console.error("[claude-mem cleanup] Cleanup completed successfully"),console.log('{"continue": true, "suppressOutput": true}'),process.exit(0)}if(I.isTTY)C(void 0);else{let p="";I.on("data",e=>p+=e),I.on("end",async()=>{let e=p?JSON.parse(p):void 0;await C(e)})}
@@ -1,7 +1,7 @@
#!/usr/bin/env node
import P from"path";import{stdin as F}from"process";import ae from"better-sqlite3";import{join as b,dirname as te,basename as fe}from"path";import{homedir as H}from"os";import{existsSync as Oe,mkdirSync as re}from"fs";import{fileURLToPath as ne}from"url";function oe(){return typeof __dirname<"u"?__dirname:te(ne(import.meta.url))}var ie=oe(),I=process.env.CLAUDE_MEM_DATA_DIR||b(H(),".claude-mem"),$=process.env.CLAUDE_CONFIG_DIR||b(H(),".claude"),Le=b(I,"archives"),ye=b(I,"logs"),ve=b(I,"trash"),Ae=b(I,"backups"),Ce=b(I,"settings.json"),j=b(I,"claude-mem.db"),De=b(I,"vector-db"),ke=b($,"settings.json"),xe=b($,"commands"),$e=b($,"CLAUDE.md");function G(d){re(d,{recursive:!0})}function Y(){return b(ie,"..","..")}var U=(n=>(n[n.DEBUG=0]="DEBUG",n[n.INFO=1]="INFO",n[n.WARN=2]="WARN",n[n.ERROR=3]="ERROR",n[n.SILENT=4]="SILENT",n))(U||{}),w=class{level;useColor;constructor(){let e=process.env.CLAUDE_MEM_LOG_LEVEL?.toUpperCase()||"INFO";this.level=U[e]??1,this.useColor=process.stdout.isTTY??!1}correlationId(e,s){return`obs-${e}-${s}`}sessionId(e){return`session-${e}`}formatData(e){if(e==null)return"";if(typeof e=="string")return e;if(typeof e=="number"||typeof e=="boolean")return e.toString();if(typeof e=="object"){if(e instanceof Error)return this.level===0?`${e.message}
${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Object.keys(e);return s.length===0?"{}":s.length<=3?JSON.stringify(e):`{${s.length} keys: ${s.slice(0,3).join(", ")}...}`}return String(e)}formatTool(e,s){if(!s)return e;try{let t=typeof s=="string"?JSON.parse(s):s;if(e==="Bash"&&t.command){let r=t.command.length>50?t.command.substring(0,50)+"...":t.command;return`${e}(${r})`}if(e==="Read"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Edit"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Write"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}return e}catch{return e}}log(e,s,t,r,n){if(e<this.level)return;let a=new Date().toISOString().replace("T"," ").substring(0,23),c=U[e].padEnd(5),u=s.padEnd(6),E="";r?.correlationId?E=`[${r.correlationId}] `:r?.sessionId&&(E=`[session-${r.sessionId}] `);let f="";n!=null&&(this.level===0&&typeof n=="object"?f=`
`+JSON.stringify(n,null,2):f=" "+this.formatData(n));let o="";if(r){let{sessionId:S,sdkSessionId:N,correlationId:m,...p}=r;Object.keys(p).length>0&&(o=` {${Object.entries(p).map(([_,T])=>`${_}=${T}`).join(", ")}}`)}let y=`[${a}] [${c}] [${u}] ${E}${t}${o}${f}`;e===3?console.error(y):console.log(y)}debug(e,s,t,r){this.log(0,e,s,t,r)}info(e,s,t,r){this.log(1,e,s,t,r)}warn(e,s,t,r){this.log(2,e,s,t,r)}error(e,s,t,r){this.log(3,e,s,t,r)}dataIn(e,s,t,r){this.info(e,`\u2192 ${s}`,t,r)}dataOut(e,s,t,r){this.info(e,`\u2190 ${s}`,t,r)}success(e,s,t,r){this.info(e,`\u2713 ${s}`,t,r)}failure(e,s,t,r){this.error(e,`\u2717 ${s}`,t,r)}timing(e,s,t,r){this.info(e,`\u23F1 ${s}`,r,{duration:`${t}ms`})}},V=new w;var D=class{db;constructor(){G(I),this.db=new ae(j),this.db.pragma("journal_mode = WAL"),this.db.pragma("synchronous = NORMAL"),this.db.pragma("foreign_keys = ON"),this.initializeSchema(),this.ensureWorkerPortColumn(),this.ensurePromptTrackingColumns(),this.removeSessionSummariesUniqueConstraint(),this.addObservationHierarchicalFields(),this.makeObservationsTextNullable(),this.createUserPromptsTable()}initializeSchema(){try{this.db.exec(`
import X from"path";import{stdin as w}from"process";import se from"better-sqlite3";import{join as S,dirname as Q,basename as _e}from"path";import{homedir as P}from"os";import{existsSync as Ee,mkdirSync as z}from"fs";import{fileURLToPath as Z}from"url";function ee(){return typeof __dirname<"u"?__dirname:Q(Z(import.meta.url))}var he=ee(),I=process.env.CLAUDE_MEM_DATA_DIR||S(P(),".claude-mem"),$=process.env.CLAUDE_CONFIG_DIR||S(P(),".claude"),ge=S(I,"archives"),be=S(I,"logs"),Se=S(I,"trash"),fe=S(I,"backups"),Re=S(I,"settings.json"),H=S(I,"claude-mem.db"),Ne=S(I,"vector-db"),Oe=S($,"settings.json"),Ie=S($,"commands"),Le=S($,"CLAUDE.md");function G(p){z(p,{recursive:!0})}var U=(i=>(i[i.DEBUG=0]="DEBUG",i[i.INFO=1]="INFO",i[i.WARN=2]="WARN",i[i.ERROR=3]="ERROR",i[i.SILENT=4]="SILENT",i))(U||{}),M=class{level;useColor;constructor(){let e=process.env.CLAUDE_MEM_LOG_LEVEL?.toUpperCase()||"INFO";this.level=U[e]??1,this.useColor=process.stdout.isTTY??!1}correlationId(e,s){return`obs-${e}-${s}`}sessionId(e){return`session-${e}`}formatData(e){if(e==null)return"";if(typeof e=="string")return e;if(typeof e=="number"||typeof e=="boolean")return e.toString();if(typeof e=="object"){if(e instanceof Error)return this.level===0?`${e.message}
${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Object.keys(e);return s.length===0?"{}":s.length<=3?JSON.stringify(e):`{${s.length} keys: ${s.slice(0,3).join(", ")}...}`}return String(e)}formatTool(e,s){if(!s)return e;try{let t=typeof s=="string"?JSON.parse(s):s;if(e==="Bash"&&t.command){let r=t.command.length>50?t.command.substring(0,50)+"...":t.command;return`${e}(${r})`}if(e==="Read"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Edit"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Write"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}return e}catch{return e}}log(e,s,t,r,i){if(e<this.level)return;let d=new Date().toISOString().replace("T"," ").substring(0,23),a=U[e].padEnd(5),_=s.padEnd(6),E="";r?.correlationId?E=`[${r.correlationId}] `:r?.sessionId&&(E=`[session-${r.sessionId}] `);let b="";i!=null&&(this.level===0&&typeof i=="object"?b=`
`+JSON.stringify(i,null,2):b=" "+this.formatData(i));let n="";if(r){let{sessionId:f,sdkSessionId:N,correlationId:l,...c}=r;Object.keys(c).length>0&&(n=` {${Object.entries(c).map(([u,T])=>`${u}=${T}`).join(", ")}}`)}let v=`[${d}] [${a}] [${_}] ${E}${t}${n}${b}`;e===3?console.error(v):console.log(v)}debug(e,s,t,r){this.log(0,e,s,t,r)}info(e,s,t,r){this.log(1,e,s,t,r)}warn(e,s,t,r){this.log(2,e,s,t,r)}error(e,s,t,r){this.log(3,e,s,t,r)}dataIn(e,s,t,r){this.info(e,`\u2192 ${s}`,t,r)}dataOut(e,s,t,r){this.info(e,`\u2190 ${s}`,t,r)}success(e,s,t,r){this.info(e,`\u2713 ${s}`,t,r)}failure(e,s,t,r){this.error(e,`\u2717 ${s}`,t,r)}timing(e,s,t,r){this.info(e,`\u23F1 ${s}`,r,{duration:`${t}ms`})}},W=new M;var D=class{db;constructor(){G(I),this.db=new se(H),this.db.pragma("journal_mode = WAL"),this.db.pragma("synchronous = NORMAL"),this.db.pragma("foreign_keys = ON"),this.initializeSchema(),this.ensureWorkerPortColumn(),this.ensurePromptTrackingColumns(),this.removeSessionSummariesUniqueConstraint(),this.addObservationHierarchicalFields(),this.makeObservationsTextNullable(),this.createUserPromptsTable()}initializeSchema(){try{this.db.exec(`
CREATE TABLE IF NOT EXISTS schema_versions (
id INTEGER PRIMARY KEY,
version INTEGER UNIQUE NOT NULL,
@@ -63,7 +63,7 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
CREATE INDEX IF NOT EXISTS idx_session_summaries_sdk_session ON session_summaries(sdk_session_id);
CREATE INDEX IF NOT EXISTS idx_session_summaries_project ON session_summaries(project);
CREATE INDEX IF NOT EXISTS idx_session_summaries_created ON session_summaries(created_at_epoch DESC);
`),this.db.prepare("INSERT INTO schema_versions (version, applied_at) VALUES (?, ?)").run(4,new Date().toISOString()),console.error("[SessionStore] Migration004 applied successfully"))}catch(e){throw console.error("[SessionStore] Schema initialization error:",e.message),e}}ensureWorkerPortColumn(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(5))return;this.db.pragma("table_info(sdk_sessions)").some(r=>r.name==="worker_port")||(this.db.exec("ALTER TABLE sdk_sessions ADD COLUMN worker_port INTEGER"),console.error("[SessionStore] Added worker_port column to sdk_sessions table")),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(5,new Date().toISOString())}catch(e){console.error("[SessionStore] Migration error:",e.message)}}ensurePromptTrackingColumns(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(6))return;this.db.pragma("table_info(sdk_sessions)").some(u=>u.name==="prompt_counter")||(this.db.exec("ALTER TABLE sdk_sessions ADD COLUMN prompt_counter INTEGER DEFAULT 0"),console.error("[SessionStore] Added prompt_counter column to sdk_sessions table")),this.db.pragma("table_info(observations)").some(u=>u.name==="prompt_number")||(this.db.exec("ALTER TABLE observations ADD COLUMN prompt_number INTEGER"),console.error("[SessionStore] Added prompt_number column to observations table")),this.db.pragma("table_info(session_summaries)").some(u=>u.name==="prompt_number")||(this.db.exec("ALTER TABLE session_summaries ADD COLUMN prompt_number INTEGER"),console.error("[SessionStore] Added prompt_number column to session_summaries table")),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(6,new Date().toISOString())}catch(e){console.error("[SessionStore] Prompt tracking migration error:",e.message)}}removeSessionSummariesUniqueConstraint(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = 
?").get(7))return;if(!this.db.pragma("index_list(session_summaries)").some(r=>r.unique===1)){this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(7,new Date().toISOString());return}console.error("[SessionStore] Removing UNIQUE constraint from session_summaries.sdk_session_id..."),this.db.exec("BEGIN TRANSACTION");try{this.db.exec(`
`),this.db.prepare("INSERT INTO schema_versions (version, applied_at) VALUES (?, ?)").run(4,new Date().toISOString()),console.error("[SessionStore] Migration004 applied successfully"))}catch(e){throw console.error("[SessionStore] Schema initialization error:",e.message),e}}ensureWorkerPortColumn(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(5))return;this.db.pragma("table_info(sdk_sessions)").some(r=>r.name==="worker_port")||(this.db.exec("ALTER TABLE sdk_sessions ADD COLUMN worker_port INTEGER"),console.error("[SessionStore] Added worker_port column to sdk_sessions table")),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(5,new Date().toISOString())}catch(e){console.error("[SessionStore] Migration error:",e.message)}}ensurePromptTrackingColumns(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(6))return;this.db.pragma("table_info(sdk_sessions)").some(_=>_.name==="prompt_counter")||(this.db.exec("ALTER TABLE sdk_sessions ADD COLUMN prompt_counter INTEGER DEFAULT 0"),console.error("[SessionStore] Added prompt_counter column to sdk_sessions table")),this.db.pragma("table_info(observations)").some(_=>_.name==="prompt_number")||(this.db.exec("ALTER TABLE observations ADD COLUMN prompt_number INTEGER"),console.error("[SessionStore] Added prompt_number column to observations table")),this.db.pragma("table_info(session_summaries)").some(_=>_.name==="prompt_number")||(this.db.exec("ALTER TABLE session_summaries ADD COLUMN prompt_number INTEGER"),console.error("[SessionStore] Added prompt_number column to session_summaries table")),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(6,new Date().toISOString())}catch(e){console.error("[SessionStore] Prompt tracking migration error:",e.message)}}removeSessionSummariesUniqueConstraint(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = 
?").get(7))return;if(!this.db.pragma("index_list(session_summaries)").some(r=>r.unique===1)){this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(7,new Date().toISOString());return}console.error("[SessionStore] Removing UNIQUE constraint from session_summaries.sdk_session_id..."),this.db.exec("BEGIN TRANSACTION");try{this.db.exec(`
CREATE TABLE session_summaries_new (
id INTEGER PRIMARY KEY AUTOINCREMENT,
sdk_session_id TEXT NOT NULL,
@@ -214,12 +214,12 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
SELECT *
FROM observations
WHERE id = ?
`).get(e)||null}getObservationsByIds(e,s={}){if(e.length===0)return[];let{orderBy:t="date_desc",limit:r}=s,n=t==="date_asc"?"ASC":"DESC",a=r?`LIMIT ${r}`:"",c=e.map(()=>"?").join(",");return this.db.prepare(`
`).get(e)||null}getObservationsByIds(e,s={}){if(e.length===0)return[];let{orderBy:t="date_desc",limit:r}=s,i=t==="date_asc"?"ASC":"DESC",d=r?`LIMIT ${r}`:"",a=e.map(()=>"?").join(",");return this.db.prepare(`
SELECT *
FROM observations
WHERE id IN (${c})
ORDER BY created_at_epoch ${n}
${a}
WHERE id IN (${a})
ORDER BY created_at_epoch ${i}
${d}
`).all(...e)}getSummaryForSession(e){return this.db.prepare(`
SELECT
request, investigated, learned, completed, next_steps,
@@ -232,7 +232,7 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
SELECT files_read, files_modified
FROM observations
WHERE sdk_session_id = ?
`).all(e),r=new Set,n=new Set;for(let a of t){if(a.files_read)try{let c=JSON.parse(a.files_read);Array.isArray(c)&&c.forEach(u=>r.add(u))}catch{}if(a.files_modified)try{let c=JSON.parse(a.files_modified);Array.isArray(c)&&c.forEach(u=>n.add(u))}catch{}}return{filesRead:Array.from(r),filesModified:Array.from(n)}}getSessionById(e){return this.db.prepare(`
`).all(e),r=new Set,i=new Set;for(let d of t){if(d.files_read)try{let a=JSON.parse(d.files_read);Array.isArray(a)&&a.forEach(_=>r.add(_))}catch{}if(d.files_modified)try{let a=JSON.parse(d.files_modified);Array.isArray(a)&&a.forEach(_=>i.add(_))}catch{}}return{filesRead:Array.from(r),filesModified:Array.from(i)}}getSessionById(e){return this.db.prepare(`
SELECT id, claude_session_id, sdk_session_id, project, user_prompt
FROM sdk_sessions
WHERE id = ?
@@ -259,17 +259,17 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
SELECT prompt_counter FROM sdk_sessions WHERE id = ?
`).get(e)?.prompt_counter||1}getPromptCounter(e){return this.db.prepare(`
SELECT prompt_counter FROM sdk_sessions WHERE id = ?
`).get(e)?.prompt_counter||0}createSDKSession(e,s,t){let r=new Date,n=r.getTime(),c=this.db.prepare(`
`).get(e)?.prompt_counter||0}createSDKSession(e,s,t){let r=new Date,i=r.getTime(),a=this.db.prepare(`
INSERT OR IGNORE INTO sdk_sessions
(claude_session_id, sdk_session_id, project, user_prompt, started_at, started_at_epoch, status)
VALUES (?, ?, ?, ?, ?, ?, 'active')
`).run(e,e,s,t,r.toISOString(),n);return c.lastInsertRowid===0||c.changes===0?this.db.prepare(`
`).run(e,e,s,t,r.toISOString(),i);return a.lastInsertRowid===0||a.changes===0?this.db.prepare(`
SELECT id FROM sdk_sessions WHERE claude_session_id = ? LIMIT 1
`).get(e).id:c.lastInsertRowid}updateSDKSessionId(e,s){return this.db.prepare(`
`).get(e).id:a.lastInsertRowid}updateSDKSessionId(e,s){return this.db.prepare(`
UPDATE sdk_sessions
SET sdk_session_id = ?
WHERE id = ? AND sdk_session_id IS NULL
`).run(s,e).changes===0?(V.debug("DB","sdk_session_id already set, skipping update",{sessionId:e,sdkSessionId:s}),!1):!0}setWorkerPort(e,s){this.db.prepare(`
`).run(s,e).changes===0?(W.debug("DB","sdk_session_id already set, skipping update",{sessionId:e,sdkSessionId:s}),!1):!0}setWorkerPort(e,s){this.db.prepare(`
UPDATE sdk_sessions
SET worker_port = ?
WHERE id = ?
@@ -278,33 +278,33 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
FROM sdk_sessions
WHERE id = ?
LIMIT 1
`).get(e)?.worker_port||null}saveUserPrompt(e,s,t){let r=new Date,n=r.getTime();return this.db.prepare(`
`).get(e)?.worker_port||null}saveUserPrompt(e,s,t){let r=new Date,i=r.getTime();return this.db.prepare(`
INSERT INTO user_prompts
(claude_session_id, prompt_number, prompt_text, created_at, created_at_epoch)
VALUES (?, ?, ?, ?, ?)
`).run(e,s,t,r.toISOString(),n).lastInsertRowid}storeObservation(e,s,t,r){let n=new Date,a=n.getTime();this.db.prepare(`
`).run(e,s,t,r.toISOString(),i).lastInsertRowid}storeObservation(e,s,t,r){let i=new Date,d=i.getTime();this.db.prepare(`
SELECT id FROM sdk_sessions WHERE sdk_session_id = ?
`).get(e)||(this.db.prepare(`
INSERT INTO sdk_sessions
(claude_session_id, sdk_session_id, project, started_at, started_at_epoch, status)
VALUES (?, ?, ?, ?, ?, 'active')
`).run(e,e,s,n.toISOString(),a),console.error(`[SessionStore] Auto-created session record for session_id: ${e}`));let f=this.db.prepare(`
`).run(e,e,s,i.toISOString(),d),console.error(`[SessionStore] Auto-created session record for session_id: ${e}`));let b=this.db.prepare(`
INSERT INTO observations
(sdk_session_id, project, type, title, subtitle, facts, narrative, concepts,
files_read, files_modified, prompt_number, created_at, created_at_epoch)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
`).run(e,s,t.type,t.title,t.subtitle,JSON.stringify(t.facts),t.narrative,JSON.stringify(t.concepts),JSON.stringify(t.files_read),JSON.stringify(t.files_modified),r||null,n.toISOString(),a);return{id:Number(f.lastInsertRowid),createdAtEpoch:a}}storeSummary(e,s,t,r){let n=new Date,a=n.getTime();this.db.prepare(`
`).run(e,s,t.type,t.title,t.subtitle,JSON.stringify(t.facts),t.narrative,JSON.stringify(t.concepts),JSON.stringify(t.files_read),JSON.stringify(t.files_modified),r||null,i.toISOString(),d);return{id:Number(b.lastInsertRowid),createdAtEpoch:d}}storeSummary(e,s,t,r){let i=new Date,d=i.getTime();this.db.prepare(`
SELECT id FROM sdk_sessions WHERE sdk_session_id = ?
`).get(e)||(this.db.prepare(`
INSERT INTO sdk_sessions
(claude_session_id, sdk_session_id, project, started_at, started_at_epoch, status)
VALUES (?, ?, ?, ?, ?, 'active')
`).run(e,e,s,n.toISOString(),a),console.error(`[SessionStore] Auto-created session record for session_id: ${e}`));let f=this.db.prepare(`
`).run(e,e,s,i.toISOString(),d),console.error(`[SessionStore] Auto-created session record for session_id: ${e}`));let b=this.db.prepare(`
INSERT INTO session_summaries
(sdk_session_id, project, request, investigated, learned, completed,
next_steps, notes, prompt_number, created_at, created_at_epoch)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
`).run(e,s,t.request,t.investigated,t.learned,t.completed,t.next_steps,t.notes,r||null,n.toISOString(),a);return{id:Number(f.lastInsertRowid),createdAtEpoch:a}}markSessionCompleted(e){let s=new Date,t=s.getTime();this.db.prepare(`
`).run(e,s,t.request,t.investigated,t.learned,t.completed,t.next_steps,t.notes,r||null,i.toISOString(),d);return{id:Number(b.lastInsertRowid),createdAtEpoch:d}}markSessionCompleted(e){let s=new Date,t=s.getTime();this.db.prepare(`
UPDATE sdk_sessions
SET status = 'completed', completed_at = ?, completed_at_epoch = ?
WHERE id = ?
@@ -316,62 +316,62 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
UPDATE sdk_sessions
SET status = 'failed', completed_at = ?, completed_at_epoch = ?
WHERE status = 'active'
`).run(e.toISOString(),s).changes}getSessionSummariesByIds(e,s={}){if(e.length===0)return[];let{orderBy:t="date_desc",limit:r}=s,n=t==="date_asc"?"ASC":"DESC",a=r?`LIMIT ${r}`:"",c=e.map(()=>"?").join(",");return this.db.prepare(`
`).run(e.toISOString(),s).changes}getSessionSummariesByIds(e,s={}){if(e.length===0)return[];let{orderBy:t="date_desc",limit:r}=s,i=t==="date_asc"?"ASC":"DESC",d=r?`LIMIT ${r}`:"",a=e.map(()=>"?").join(",");return this.db.prepare(`
SELECT * FROM session_summaries
WHERE id IN (${c})
ORDER BY created_at_epoch ${n}
${a}
`).all(...e)}getUserPromptsByIds(e,s={}){if(e.length===0)return[];let{orderBy:t="date_desc",limit:r}=s,n=t==="date_asc"?"ASC":"DESC",a=r?`LIMIT ${r}`:"",c=e.map(()=>"?").join(",");return this.db.prepare(`
WHERE id IN (${a})
ORDER BY created_at_epoch ${i}
${d}
`).all(...e)}getUserPromptsByIds(e,s={}){if(e.length===0)return[];let{orderBy:t="date_desc",limit:r}=s,i=t==="date_asc"?"ASC":"DESC",d=r?`LIMIT ${r}`:"",a=e.map(()=>"?").join(",");return this.db.prepare(`
SELECT
up.*,
s.project,
s.sdk_session_id
FROM user_prompts up
JOIN sdk_sessions s ON up.claude_session_id = s.claude_session_id
WHERE up.id IN (${c})
ORDER BY up.created_at_epoch ${n}
${a}
`).all(...e)}getTimelineAroundTimestamp(e,s=10,t=10,r){return this.getTimelineAroundObservation(null,e,s,t,r)}getTimelineAroundObservation(e,s,t=10,r=10,n){let a=n?"AND project = ?":"",c=n?[n]:[],u,E;if(e!==null){let S=`
WHERE up.id IN (${a})
ORDER BY up.created_at_epoch ${i}
${d}
`).all(...e)}getTimelineAroundTimestamp(e,s=10,t=10,r){return this.getTimelineAroundObservation(null,e,s,t,r)}getTimelineAroundObservation(e,s,t=10,r=10,i){let d=i?"AND project = ?":"",a=i?[i]:[],_,E;if(e!==null){let f=`
SELECT id, created_at_epoch
FROM observations
WHERE id <= ? ${a}
WHERE id <= ? ${d}
ORDER BY id DESC
LIMIT ?
`,N=`
SELECT id, created_at_epoch
FROM observations
WHERE id >= ? ${a}
WHERE id >= ? ${d}
ORDER BY id ASC
LIMIT ?
`;try{let m=this.db.prepare(S).all(e,...c,t+1),p=this.db.prepare(N).all(e,...c,r+1);if(m.length===0&&p.length===0)return{observations:[],sessions:[],prompts:[]};u=m.length>0?m[m.length-1].created_at_epoch:s,E=p.length>0?p[p.length-1].created_at_epoch:s}catch(m){return console.error("[SessionStore] Error getting boundary observations:",m.message),{observations:[],sessions:[],prompts:[]}}}else{let S=`
`;try{let l=this.db.prepare(f).all(e,...a,t+1),c=this.db.prepare(N).all(e,...a,r+1);if(l.length===0&&c.length===0)return{observations:[],sessions:[],prompts:[]};_=l.length>0?l[l.length-1].created_at_epoch:s,E=c.length>0?c[c.length-1].created_at_epoch:s}catch(l){return console.error("[SessionStore] Error getting boundary observations:",l.message),{observations:[],sessions:[],prompts:[]}}}else{let f=`
SELECT created_at_epoch
FROM observations
WHERE created_at_epoch <= ? ${a}
WHERE created_at_epoch <= ? ${d}
ORDER BY created_at_epoch DESC
LIMIT ?
`,N=`
SELECT created_at_epoch
FROM observations
WHERE created_at_epoch >= ? ${a}
WHERE created_at_epoch >= ? ${d}
ORDER BY created_at_epoch ASC
LIMIT ?
`;try{let m=this.db.prepare(S).all(s,...c,t),p=this.db.prepare(N).all(s,...c,r+1);if(m.length===0&&p.length===0)return{observations:[],sessions:[],prompts:[]};u=m.length>0?m[m.length-1].created_at_epoch:s,E=p.length>0?p[p.length-1].created_at_epoch:s}catch(m){return console.error("[SessionStore] Error getting boundary timestamps:",m.message),{observations:[],sessions:[],prompts:[]}}}let f=`
`;try{let l=this.db.prepare(f).all(s,...a,t),c=this.db.prepare(N).all(s,...a,r+1);if(l.length===0&&c.length===0)return{observations:[],sessions:[],prompts:[]};_=l.length>0?l[l.length-1].created_at_epoch:s,E=c.length>0?c[c.length-1].created_at_epoch:s}catch(l){return console.error("[SessionStore] Error getting boundary timestamps:",l.message),{observations:[],sessions:[],prompts:[]}}}let b=`
SELECT *
FROM observations
WHERE created_at_epoch >= ? AND created_at_epoch <= ? ${a}
WHERE created_at_epoch >= ? AND created_at_epoch <= ? ${d}
ORDER BY created_at_epoch ASC
`,o=`
`,n=`
SELECT *
FROM session_summaries
WHERE created_at_epoch >= ? AND created_at_epoch <= ? ${a}
WHERE created_at_epoch >= ? AND created_at_epoch <= ? ${d}
ORDER BY created_at_epoch ASC
`,y=`
`,v=`
SELECT up.*, s.project, s.sdk_session_id
FROM user_prompts up
JOIN sdk_sessions s ON up.claude_session_id = s.claude_session_id
WHERE up.created_at_epoch >= ? AND up.created_at_epoch <= ? ${a.replace("project","s.project")}
WHERE up.created_at_epoch >= ? AND up.created_at_epoch <= ? ${d.replace("project","s.project")}
ORDER BY up.created_at_epoch ASC
`;try{let S=this.db.prepare(f).all(u,E,...c),N=this.db.prepare(o).all(u,E,...c),m=this.db.prepare(y).all(u,E,...c);return{observations:S,sessions:N.map(p=>({id:p.id,sdk_session_id:p.sdk_session_id,project:p.project,request:p.request,completed:p.completed,next_steps:p.next_steps,created_at:p.created_at,created_at_epoch:p.created_at_epoch})),prompts:m.map(p=>({id:p.id,claude_session_id:p.claude_session_id,project:p.project,prompt:p.prompt_text,created_at:p.created_at,created_at_epoch:p.created_at_epoch}))}}catch(S){return console.error("[SessionStore] Error querying timeline records:",S.message),{observations:[],sessions:[],prompts:[]}}}close(){this.db.close()}};import M from"path";import{existsSync as X}from"fs";import{spawn as de}from"child_process";var ce=parseInt(process.env.CLAUDE_MEM_WORKER_PORT||"37777",10),pe=`http://127.0.0.1:${ce}/health`;async function K(){try{return(await fetch(pe,{signal:AbortSignal.timeout(500)})).ok}catch{return!1}}async function q(){try{if(await K())return!0;console.error("[claude-mem] Worker not responding, starting...");let d=Y(),e=M.join(d,"plugin","scripts","worker-service.cjs");if(!X(e))return console.error(`[claude-mem] Worker service not found at ${e}`),!1;let s=M.join(d,"ecosystem.config.cjs"),t=M.join(d,"node_modules",".bin","pm2");if(!X(t))throw new Error(`PM2 binary not found at ${t}. This is a bundled dependency - try running: npm install`);if(!X(s))throw new Error(`PM2 ecosystem config not found at ${s}. 
Plugin installation may be corrupted.`);let r=de(t,["start",s],{detached:!0,stdio:"ignore",cwd:d});r.on("error",n=>{throw new Error(`Failed to spawn PM2: ${n.message}`)}),r.unref(),console.error("[claude-mem] Worker started with PM2");for(let n=0;n<3;n++)if(await new Promise(a=>setTimeout(a,500)),await K())return console.error("[claude-mem] Worker is healthy"),!0;return console.error("[claude-mem] Worker failed to become healthy after startup"),!1}catch(d){return console.error(`[claude-mem] Failed to start worker: ${d.message}`),!1}}var ue=parseInt(process.env.CLAUDE_MEM_CONTEXT_OBSERVATIONS||"50",10),J=10,i={reset:"\x1B[0m",bright:"\x1B[1m",dim:"\x1B[2m",cyan:"\x1B[36m",green:"\x1B[32m",yellow:"\x1B[33m",blue:"\x1B[34m",magenta:"\x1B[35m",gray:"\x1B[90m",red:"\x1B[31m"};function _e(d){if(!d)return[];let e=JSON.parse(d);return Array.isArray(e)?e:[]}function me(d){return new Date(d).toLocaleString("en-US",{month:"short",day:"numeric",hour:"numeric",minute:"2-digit",hour12:!0})}function le(d){return new Date(d).toLocaleString("en-US",{hour:"numeric",minute:"2-digit",hour12:!0})}function Ee(d){return new Date(d).toLocaleString("en-US",{month:"short",day:"numeric",year:"numeric"})}function Te(d){return d?Math.ceil(d.length/4):0}function he(d,e){return P.isAbsolute(d)?P.relative(e,d):d}function Q(d,e=!1,s=!1){q();let t=d?.cwd??process.cwd(),r=t?P.basename(t):"unknown-project",n=new D,a=n.db.prepare(`
`;try{let f=this.db.prepare(b).all(_,E,...a),N=this.db.prepare(n).all(_,E,...a),l=this.db.prepare(v).all(_,E,...a);return{observations:f,sessions:N.map(c=>({id:c.id,sdk_session_id:c.sdk_session_id,project:c.project,request:c.request,completed:c.completed,next_steps:c.next_steps,created_at:c.created_at,created_at_epoch:c.created_at_epoch})),prompts:l.map(c=>({id:c.id,claude_session_id:c.claude_session_id,project:c.project,prompt:c.prompt_text,created_at:c.created_at,created_at_epoch:c.created_at_epoch}))}}catch(f){return console.error("[SessionStore] Error querying timeline records:",f.message),{observations:[],sessions:[],prompts:[]}}}close(){this.db.close()}};var te=parseInt(process.env.CLAUDE_MEM_CONTEXT_OBSERVATIONS||"50",10),j=10,o={reset:"\x1B[0m",bright:"\x1B[1m",dim:"\x1B[2m",cyan:"\x1B[36m",green:"\x1B[32m",yellow:"\x1B[33m",blue:"\x1B[34m",magenta:"\x1B[35m",gray:"\x1B[90m",red:"\x1B[31m"};function re(p){if(!p)return[];let e=JSON.parse(p);return Array.isArray(e)?e:[]}function ne(p){return new Date(p).toLocaleString("en-US",{month:"short",day:"numeric",hour:"numeric",minute:"2-digit",hour12:!0})}function ie(p){return new Date(p).toLocaleString("en-US",{hour:"numeric",minute:"2-digit",hour12:!0})}function oe(p){return new Date(p).toLocaleString("en-US",{month:"short",day:"numeric",year:"numeric"})}function ae(p){return p?Math.ceil(p.length/4):0}function de(p,e){return X.isAbsolute(p)?X.relative(e,p):p}function Y(p,e=!1,s=!1){let t=p?.cwd??process.cwd(),r=t?X.basename(t):"unknown-project",i=new D,d=i.db.prepare(`
SELECT
id, sdk_session_id, type, title, subtitle, narrative,
facts, concepts, files_read, files_modified,
@@ -380,18 +380,18 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
WHERE project = ?
ORDER BY created_at_epoch DESC
LIMIT ?
`).all(r,ue),c=n.db.prepare(`
`).all(r,te),a=i.db.prepare(`
SELECT id, sdk_session_id, request, completed, next_steps, created_at, created_at_epoch
FROM session_summaries
WHERE project = ?
ORDER BY created_at_epoch DESC
LIMIT ?
`).all(r,J+1);if(a.length===0&&c.length===0)return n.close(),e?`
${i.bright}${i.cyan}\u{1F4DD} [${r}] recent context${i.reset}
${i.gray}${"\u2500".repeat(60)}${i.reset}
`).all(r,j+1);if(d.length===0&&a.length===0)return i.close(),e?`
${o.bright}${o.cyan}\u{1F4DD} [${r}] recent context${o.reset}
${o.gray}${"\u2500".repeat(60)}${o.reset}

${i.dim}No previous sessions found for this project yet.${i.reset}
${o.dim}No previous sessions found for this project yet.${o.reset}
`:`# [${r}] recent context

No previous sessions found for this project yet.`;let u=a,E=c.slice(0,J),f=u,o=[];if(e?(o.push(""),o.push(`${i.bright}${i.cyan}\u{1F4DD} [${r}] recent context${i.reset}`),o.push(`${i.gray}${"\u2500".repeat(60)}${i.reset}`),o.push("")):(o.push(`# [${r}] recent context`),o.push("")),f.length>0){e?(o.push(`${i.dim}Legend: \u{1F3AF} session-request | \u{1F534} bugfix | \u{1F7E3} feature | \u{1F504} refactor | \u2705 change | \u{1F535} discovery | \u{1F9E0} decision${i.reset}`),o.push("")):(o.push("**Legend:** \u{1F3AF} session-request | \u{1F534} bugfix | \u{1F7E3} feature | \u{1F504} refactor | \u2705 change | \u{1F535} discovery | \u{1F9E0} decision"),o.push("")),e?(o.push(`${i.dim}\u{1F4A1} Progressive Disclosure: This index shows WHAT exists (titles) and retrieval COST (token counts).${i.reset}`),o.push(`${i.dim} \u2192 Use MCP search tools to fetch full observation details on-demand (Layer 2)${i.reset}`),o.push(`${i.dim} \u2192 Prefer searching observations over re-reading code for past decisions and learnings${i.reset}`),o.push(`${i.dim} \u2192 Critical types (\u{1F534} bugfix, \u{1F9E0} decision) often worth fetching immediately${i.reset}`),o.push("")):(o.push("\u{1F4A1} **Progressive Disclosure:** This index shows WHAT exists (titles) and retrieval COST (token counts)."),o.push("- Use MCP search tools to fetch full observation details on-demand (Layer 2)"),o.push("- Prefer searching observations over re-reading code for past decisions and learnings"),o.push("- Critical types (\u{1F534} bugfix, \u{1F9E0} decision) often worth fetching immediately"),o.push(""));let y=c[0]?.id,S=E.map((_,T)=>{let l=T===0?null:c[T+1];return{..._,displayEpoch:l?l.created_at_epoch:_.created_at_epoch,displayTime:l?l.created_at:_.created_at,isMostRecent:_.id===y}}),N=[...f.map(_=>({type:"observation",data:_})),...S.map(_=>({type:"summary",data:_}))];N.sort((_,T)=>{let 
l=_.type==="observation"?_.data.created_at_epoch:_.data.displayEpoch,L=T.type==="observation"?T.data.created_at_epoch:T.data.displayEpoch;return l-L});let m=new Map;for(let _ of N){let T=_.type==="observation"?_.data.created_at:_.data.displayTime,l=Ee(T);m.has(l)||m.set(l,[]),m.get(l).push(_)}let p=Array.from(m.entries()).sort((_,T)=>{let l=new Date(_[0]).getTime(),L=new Date(T[0]).getTime();return l-L});for(let[_,T]of p){e?(o.push(`${i.bright}${i.cyan}${_}${i.reset}`),o.push("")):(o.push(`### ${_}`),o.push(""));let l=null,L="",v=!1;for(let k of T)if(k.type==="summary"){v&&(o.push(""),v=!1,l=null,L="");let h=k.data,A=`${h.request||"Session started"} (${me(h.displayTime)})`,O=h.isMostRecent?"":`claude-mem://session-summary/${h.id}`;if(e){let g=O?`${i.dim}[${O}]${i.reset}`:"";o.push(`\u{1F3AF} ${i.yellow}#S${h.id}${i.reset} ${A} ${g}`)}else{let g=O?` [\u2192](${O})`:"";o.push(`**\u{1F3AF} #S${h.id}** ${A}${g}`)}o.push("")}else{let h=k.data,A=_e(h.files_modified),O=A.length>0?he(A[0],t):"General";O!==l&&(v&&o.push(""),e?o.push(`${i.dim}${O}${i.reset}`):o.push(`**${O}**`),e||(o.push("| ID | Time | T | Title | Tokens |"),o.push("|----|------|---|-------|--------|")),l=O,v=!0,L="");let g="\u2022";switch(h.type){case"bugfix":g="\u{1F534}";break;case"feature":g="\u{1F7E3}";break;case"refactor":g="\u{1F504}";break;case"change":g="\u2705";break;case"discovery":g="\u{1F535}";break;case"decision":g="\u{1F9E0}";break;default:g="\u2022"}let C=le(h.created_at),B=h.title||"Untitled",x=Te(h.narrative),W=C!==L,Z=W?C:"";if(L=C,e){let ee=W?`${i.dim}${C}${i.reset}`:" ".repeat(C.length),se=x>0?`${i.dim}(~${x}t)${i.reset}`:"";o.push(` ${i.dim}#${h.id}${i.reset} ${ee} ${g} ${B} ${se}`)}else o.push(`| #${h.id} | ${Z||"\u2033"} | ${g} | ${B} | ~${x} |`)}v&&o.push("")}let R=c[0];R&&(R.completed||R.next_steps)&&(R.completed&&(e?o.push(`${i.green}Completed:${i.reset} ${R.completed}`):o.push(`**Completed**: ${R.completed}`),o.push("")),R.next_steps&&(e?o.push(`${i.magenta}Next Steps:${i.reset} 
${R.next_steps}`):o.push(`**Next Steps**: ${R.next_steps}`),o.push(""))),e?o.push(`${i.dim}Use claude-mem MCP search to access records with the given ID${i.reset}`):o.push("*Use claude-mem MCP search to access records with the given ID*")}return n.close(),o.join(`
`).trimEnd()}var z=process.argv.includes("--index"),ge=process.argv.includes("--colors");if(F.isTTY||ge){let d=Q(void 0,!0,z);console.log(d),process.exit(0)}else{let d="";F.on("data",e=>d+=e),F.on("end",()=>{let e=d.trim()?JSON.parse(d):void 0,t={hookSpecificOutput:{hookEventName:"SessionStart",additionalContext:Q(e,!1,z)}};console.log(JSON.stringify(t)),process.exit(0)})}
No previous sessions found for this project yet.`;let _=d,E=a.slice(0,j),b=_,n=[];if(e?(n.push(""),n.push(`${o.bright}${o.cyan}\u{1F4DD} [${r}] recent context${o.reset}`),n.push(`${o.gray}${"\u2500".repeat(60)}${o.reset}`),n.push("")):(n.push(`# [${r}] recent context`),n.push("")),b.length>0){e?(n.push(`${o.dim}Legend: \u{1F3AF} session-request | \u{1F534} bugfix | \u{1F7E3} feature | \u{1F504} refactor | \u2705 change | \u{1F535} discovery | \u{1F9E0} decision${o.reset}`),n.push("")):(n.push("**Legend:** \u{1F3AF} session-request | \u{1F534} bugfix | \u{1F7E3} feature | \u{1F504} refactor | \u2705 change | \u{1F535} discovery | \u{1F9E0} decision"),n.push("")),e?(n.push(`${o.dim}\u{1F4A1} Progressive Disclosure: This index shows WHAT exists (titles) and retrieval COST (token counts).${o.reset}`),n.push(`${o.dim} \u2192 Use MCP search tools to fetch full observation details on-demand (Layer 2)${o.reset}`),n.push(`${o.dim} \u2192 Prefer searching observations over re-reading code for past decisions and learnings${o.reset}`),n.push(`${o.dim} \u2192 Critical types (\u{1F534} bugfix, \u{1F9E0} decision) often worth fetching immediately${o.reset}`),n.push("")):(n.push("\u{1F4A1} **Progressive Disclosure:** This index shows WHAT exists (titles) and retrieval COST (token counts)."),n.push("- Use MCP search tools to fetch full observation details on-demand (Layer 2)"),n.push("- Prefer searching observations over re-reading code for past decisions and learnings"),n.push("- Critical types (\u{1F534} bugfix, \u{1F9E0} decision) often worth fetching immediately"),n.push(""));let v=a[0]?.id,f=E.map((u,T)=>{let m=T===0?null:a[T+1];return{...u,displayEpoch:m?m.created_at_epoch:u.created_at_epoch,displayTime:m?m.created_at:u.created_at,isMostRecent:u.id===v}}),N=[...b.map(u=>({type:"observation",data:u})),...f.map(u=>({type:"summary",data:u}))];N.sort((u,T)=>{let 
m=u.type==="observation"?u.data.created_at_epoch:u.data.displayEpoch,L=T.type==="observation"?T.data.created_at_epoch:T.data.displayEpoch;return m-L});let l=new Map;for(let u of N){let T=u.type==="observation"?u.data.created_at:u.data.displayTime,m=oe(T);l.has(m)||l.set(m,[]),l.get(m).push(u)}let c=Array.from(l.entries()).sort((u,T)=>{let m=new Date(u[0]).getTime(),L=new Date(T[0]).getTime();return m-L});for(let[u,T]of c){e?(n.push(`${o.bright}${o.cyan}${u}${o.reset}`),n.push("")):(n.push(`### ${u}`),n.push(""));let m=null,L="",A=!1;for(let x of T)if(x.type==="summary"){A&&(n.push(""),A=!1,m=null,L="");let h=x.data,y=`${h.request||"Session started"} (${ne(h.displayTime)})`,O=h.isMostRecent?"":`claude-mem://session-summary/${h.id}`;if(e){let g=O?`${o.dim}[${O}]${o.reset}`:"";n.push(`\u{1F3AF} ${o.yellow}#S${h.id}${o.reset} ${y} ${g}`)}else{let g=O?` [\u2192](${O})`:"";n.push(`**\u{1F3AF} #S${h.id}** ${y}${g}`)}n.push("")}else{let h=x.data,y=re(h.files_modified),O=y.length>0?de(y[0],t):"General";O!==m&&(A&&n.push(""),e?n.push(`${o.dim}${O}${o.reset}`):n.push(`**${O}**`),e||(n.push("| ID | Time | T | Title | Tokens |"),n.push("|----|------|---|-------|--------|")),m=O,A=!0,L="");let g="\u2022";switch(h.type){case"bugfix":g="\u{1F534}";break;case"feature":g="\u{1F7E3}";break;case"refactor":g="\u{1F504}";break;case"change":g="\u2705";break;case"discovery":g="\u{1F535}";break;case"decision":g="\u{1F9E0}";break;default:g="\u2022"}let C=ie(h.created_at),F=h.title||"Untitled",k=ae(h.narrative),B=C!==L,q=B?C:"";if(L=C,e){let K=B?`${o.dim}${C}${o.reset}`:" ".repeat(C.length),J=k>0?`${o.dim}(~${k}t)${o.reset}`:"";n.push(` ${o.dim}#${h.id}${o.reset} ${K} ${g} ${F} ${J}`)}else n.push(`| #${h.id} | ${q||"\u2033"} | ${g} | ${F} | ~${k} |`)}A&&n.push("")}let R=a[0];R&&(R.completed||R.next_steps)&&(R.completed&&(e?n.push(`${o.green}Completed:${o.reset} ${R.completed}`):n.push(`**Completed**: ${R.completed}`),n.push("")),R.next_steps&&(e?n.push(`${o.magenta}Next Steps:${o.reset} 
${R.next_steps}`):n.push(`**Next Steps**: ${R.next_steps}`),n.push(""))),e?n.push(`${o.dim}Use claude-mem MCP search to access records with the given ID${o.reset}`):n.push("*Use claude-mem MCP search to access records with the given ID*")}return i.close(),n.join(`
`).trimEnd()}var V=process.argv.includes("--index"),ce=process.argv.includes("--colors");if(w.isTTY||ce){let p=Y(void 0,!0,V);console.log(p),process.exit(0)}else{let p="";w.on("data",e=>p+=e),w.on("end",()=>{let e=p.trim()?JSON.parse(p):void 0,t={hookSpecificOutput:{hookEventName:"SessionStart",additionalContext:Y(e,!1,V)}};console.log(JSON.stringify(t)),process.exit(0)})}
+35
-35
@@ -1,7 +1,7 @@
#!/usr/bin/env node
import V from"path";import{stdin as M}from"process";import G from"better-sqlite3";import{join as E,dirname as P,basename as z}from"path";import{homedir as L}from"os";import{existsSync as te,mkdirSync as H}from"fs";import{fileURLToPath as B}from"url";function $(){return typeof __dirname<"u"?__dirname:P(B(import.meta.url))}var W=$(),m=process.env.CLAUDE_MEM_DATA_DIR||E(L(),".claude-mem"),g=process.env.CLAUDE_CONFIG_DIR||E(L(),".claude"),oe=E(m,"archives"),ne=E(m,"logs"),ie=E(m,"trash"),ae=E(m,"backups"),de=E(m,"settings.json"),A=E(m,"claude-mem.db"),pe=E(m,"vector-db"),ce=E(g,"settings.json"),_e=E(g,"commands"),ue=E(g,"CLAUDE.md");function C(d){H(d,{recursive:!0})}function v(){return E(W,"..","..")}var h=(o=>(o[o.DEBUG=0]="DEBUG",o[o.INFO=1]="INFO",o[o.WARN=2]="WARN",o[o.ERROR=3]="ERROR",o[o.SILENT=4]="SILENT",o))(h||{}),N=class{level;useColor;constructor(){let e=process.env.CLAUDE_MEM_LOG_LEVEL?.toUpperCase()||"INFO";this.level=h[e]??1,this.useColor=process.stdout.isTTY??!1}correlationId(e,s){return`obs-${e}-${s}`}sessionId(e){return`session-${e}`}formatData(e){if(e==null)return"";if(typeof e=="string")return e;if(typeof e=="number"||typeof e=="boolean")return e.toString();if(typeof e=="object"){if(e instanceof Error)return this.level===0?`${e.message}
${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Object.keys(e);return s.length===0?"{}":s.length<=3?JSON.stringify(e):`{${s.length} keys: ${s.slice(0,3).join(", ")}...}`}return String(e)}formatTool(e,s){if(!s)return e;try{let t=typeof s=="string"?JSON.parse(s):s;if(e==="Bash"&&t.command){let r=t.command.length>50?t.command.substring(0,50)+"...":t.command;return`${e}(${r})`}if(e==="Read"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Edit"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Write"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}return e}catch{return e}}log(e,s,t,r,o){if(e<this.level)return;let n=new Date().toISOString().replace("T"," ").substring(0,23),i=h[e].padEnd(5),p=s.padEnd(6),_="";r?.correlationId?_=`[${r.correlationId}] `:r?.sessionId&&(_=`[session-${r.sessionId}] `);let u="";o!=null&&(this.level===0&&typeof o=="object"?u=`
`+JSON.stringify(o,null,2):u=" "+this.formatData(o));let l="";if(r){let{sessionId:T,sdkSessionId:S,correlationId:c,...a}=r;Object.keys(a).length>0&&(l=` {${Object.entries(a).map(([X,F])=>`${X}=${F}`).join(", ")}}`)}let b=`[${n}] [${i}] [${p}] ${_}${t}${l}${u}`;e===3?console.error(b):console.log(b)}debug(e,s,t,r){this.log(0,e,s,t,r)}info(e,s,t,r){this.log(1,e,s,t,r)}warn(e,s,t,r){this.log(2,e,s,t,r)}error(e,s,t,r){this.log(3,e,s,t,r)}dataIn(e,s,t,r){this.info(e,`\u2192 ${s}`,t,r)}dataOut(e,s,t,r){this.info(e,`\u2190 ${s}`,t,r)}success(e,s,t,r){this.info(e,`\u2713 ${s}`,t,r)}failure(e,s,t,r){this.error(e,`\u2717 ${s}`,t,r)}timing(e,s,t,r){this.info(e,`\u23F1 ${s}`,r,{duration:`${t}ms`})}},y=new N;var R=class{db;constructor(){C(m),this.db=new G(A),this.db.pragma("journal_mode = WAL"),this.db.pragma("synchronous = NORMAL"),this.db.pragma("foreign_keys = ON"),this.initializeSchema(),this.ensureWorkerPortColumn(),this.ensurePromptTrackingColumns(),this.removeSessionSummariesUniqueConstraint(),this.addObservationHierarchicalFields(),this.makeObservationsTextNullable(),this.createUserPromptsTable()}initializeSchema(){try{this.db.exec(`
import F from"path";import{stdin as v}from"process";import w from"better-sqlite3";import{join as E,dirname as k,basename as $}from"path";import{homedir as I}from"os";import{existsSync as Y,mkdirSync as x}from"fs";import{fileURLToPath as U}from"url";function M(){return typeof __dirname<"u"?__dirname:k(U(import.meta.url))}var V=M(),m=process.env.CLAUDE_MEM_DATA_DIR||E(I(),".claude-mem"),g=process.env.CLAUDE_CONFIG_DIR||E(I(),".claude"),q=E(m,"archives"),J=E(m,"logs"),Q=E(m,"trash"),z=E(m,"backups"),Z=E(m,"settings.json"),f=E(m,"claude-mem.db"),ee=E(m,"vector-db"),se=E(g,"settings.json"),te=E(g,"commands"),re=E(g,"CLAUDE.md");function L(c){x(c,{recursive:!0})}var N=(n=>(n[n.DEBUG=0]="DEBUG",n[n.INFO=1]="INFO",n[n.WARN=2]="WARN",n[n.ERROR=3]="ERROR",n[n.SILENT=4]="SILENT",n))(N||{}),h=class{level;useColor;constructor(){let e=process.env.CLAUDE_MEM_LOG_LEVEL?.toUpperCase()||"INFO";this.level=N[e]??1,this.useColor=process.stdout.isTTY??!1}correlationId(e,s){return`obs-${e}-${s}`}sessionId(e){return`session-${e}`}formatData(e){if(e==null)return"";if(typeof e=="string")return e;if(typeof e=="number"||typeof e=="boolean")return e.toString();if(typeof e=="object"){if(e instanceof Error)return this.level===0?`${e.message}
${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Object.keys(e);return s.length===0?"{}":s.length<=3?JSON.stringify(e):`{${s.length} keys: ${s.slice(0,3).join(", ")}...}`}return String(e)}formatTool(e,s){if(!s)return e;try{let t=typeof s=="string"?JSON.parse(s):s;if(e==="Bash"&&t.command){let r=t.command.length>50?t.command.substring(0,50)+"...":t.command;return`${e}(${r})`}if(e==="Read"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Edit"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Write"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}return e}catch{return e}}log(e,s,t,r,n){if(e<this.level)return;let o=new Date().toISOString().replace("T"," ").substring(0,23),i=N[e].padEnd(5),d=s.padEnd(6),p="";r?.correlationId?p=`[${r.correlationId}] `:r?.sessionId&&(p=`[session-${r.sessionId}] `);let u="";n!=null&&(this.level===0&&typeof n=="object"?u=`
`+JSON.stringify(n,null,2):u=" "+this.formatData(n));let T="";if(r){let{sessionId:l,sdkSessionId:S,correlationId:_,...a}=r;Object.keys(a).length>0&&(T=` {${Object.entries(a).map(([D,y])=>`${D}=${y}`).join(", ")}}`)}let b=`[${o}] [${i}] [${d}] ${p}${t}${T}${u}`;e===3?console.error(b):console.log(b)}debug(e,s,t,r){this.log(0,e,s,t,r)}info(e,s,t,r){this.log(1,e,s,t,r)}warn(e,s,t,r){this.log(2,e,s,t,r)}error(e,s,t,r){this.log(3,e,s,t,r)}dataIn(e,s,t,r){this.info(e,`\u2192 ${s}`,t,r)}dataOut(e,s,t,r){this.info(e,`\u2190 ${s}`,t,r)}success(e,s,t,r){this.info(e,`\u2713 ${s}`,t,r)}failure(e,s,t,r){this.error(e,`\u2717 ${s}`,t,r)}timing(e,s,t,r){this.info(e,`\u23F1 ${s}`,r,{duration:`${t}ms`})}},A=new h;var R=class{db;constructor(){L(m),this.db=new w(f),this.db.pragma("journal_mode = WAL"),this.db.pragma("synchronous = NORMAL"),this.db.pragma("foreign_keys = ON"),this.initializeSchema(),this.ensureWorkerPortColumn(),this.ensurePromptTrackingColumns(),this.removeSessionSummariesUniqueConstraint(),this.addObservationHierarchicalFields(),this.makeObservationsTextNullable(),this.createUserPromptsTable()}initializeSchema(){try{this.db.exec(`
CREATE TABLE IF NOT EXISTS schema_versions (
id INTEGER PRIMARY KEY,
version INTEGER UNIQUE NOT NULL,
@@ -63,7 +63,7 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
CREATE INDEX IF NOT EXISTS idx_session_summaries_sdk_session ON session_summaries(sdk_session_id);
CREATE INDEX IF NOT EXISTS idx_session_summaries_project ON session_summaries(project);
CREATE INDEX IF NOT EXISTS idx_session_summaries_created ON session_summaries(created_at_epoch DESC);
`),this.db.prepare("INSERT INTO schema_versions (version, applied_at) VALUES (?, ?)").run(4,new Date().toISOString()),console.error("[SessionStore] Migration004 applied successfully"))}catch(e){throw console.error("[SessionStore] Schema initialization error:",e.message),e}}ensureWorkerPortColumn(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(5))return;this.db.pragma("table_info(sdk_sessions)").some(r=>r.name==="worker_port")||(this.db.exec("ALTER TABLE sdk_sessions ADD COLUMN worker_port INTEGER"),console.error("[SessionStore] Added worker_port column to sdk_sessions table")),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(5,new Date().toISOString())}catch(e){console.error("[SessionStore] Migration error:",e.message)}}ensurePromptTrackingColumns(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(6))return;this.db.pragma("table_info(sdk_sessions)").some(p=>p.name==="prompt_counter")||(this.db.exec("ALTER TABLE sdk_sessions ADD COLUMN prompt_counter INTEGER DEFAULT 0"),console.error("[SessionStore] Added prompt_counter column to sdk_sessions table")),this.db.pragma("table_info(observations)").some(p=>p.name==="prompt_number")||(this.db.exec("ALTER TABLE observations ADD COLUMN prompt_number INTEGER"),console.error("[SessionStore] Added prompt_number column to observations table")),this.db.pragma("table_info(session_summaries)").some(p=>p.name==="prompt_number")||(this.db.exec("ALTER TABLE session_summaries ADD COLUMN prompt_number INTEGER"),console.error("[SessionStore] Added prompt_number column to session_summaries table")),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(6,new Date().toISOString())}catch(e){console.error("[SessionStore] Prompt tracking migration error:",e.message)}}removeSessionSummariesUniqueConstraint(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = 
?").get(7))return;if(!this.db.pragma("index_list(session_summaries)").some(r=>r.unique===1)){this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(7,new Date().toISOString());return}console.error("[SessionStore] Removing UNIQUE constraint from session_summaries.sdk_session_id..."),this.db.exec("BEGIN TRANSACTION");try{this.db.exec(`
`),this.db.prepare("INSERT INTO schema_versions (version, applied_at) VALUES (?, ?)").run(4,new Date().toISOString()),console.error("[SessionStore] Migration004 applied successfully"))}catch(e){throw console.error("[SessionStore] Schema initialization error:",e.message),e}}ensureWorkerPortColumn(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(5))return;this.db.pragma("table_info(sdk_sessions)").some(r=>r.name==="worker_port")||(this.db.exec("ALTER TABLE sdk_sessions ADD COLUMN worker_port INTEGER"),console.error("[SessionStore] Added worker_port column to sdk_sessions table")),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(5,new Date().toISOString())}catch(e){console.error("[SessionStore] Migration error:",e.message)}}ensurePromptTrackingColumns(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(6))return;this.db.pragma("table_info(sdk_sessions)").some(d=>d.name==="prompt_counter")||(this.db.exec("ALTER TABLE sdk_sessions ADD COLUMN prompt_counter INTEGER DEFAULT 0"),console.error("[SessionStore] Added prompt_counter column to sdk_sessions table")),this.db.pragma("table_info(observations)").some(d=>d.name==="prompt_number")||(this.db.exec("ALTER TABLE observations ADD COLUMN prompt_number INTEGER"),console.error("[SessionStore] Added prompt_number column to observations table")),this.db.pragma("table_info(session_summaries)").some(d=>d.name==="prompt_number")||(this.db.exec("ALTER TABLE session_summaries ADD COLUMN prompt_number INTEGER"),console.error("[SessionStore] Added prompt_number column to session_summaries table")),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(6,new Date().toISOString())}catch(e){console.error("[SessionStore] Prompt tracking migration error:",e.message)}}removeSessionSummariesUniqueConstraint(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = 
?").get(7))return;if(!this.db.pragma("index_list(session_summaries)").some(r=>r.unique===1)){this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(7,new Date().toISOString());return}console.error("[SessionStore] Removing UNIQUE constraint from session_summaries.sdk_session_id..."),this.db.exec("BEGIN TRANSACTION");try{this.db.exec(`
CREATE TABLE session_summaries_new (
id INTEGER PRIMARY KEY AUTOINCREMENT,
sdk_session_id TEXT NOT NULL,
@@ -214,12 +214,12 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
SELECT *
FROM observations
WHERE id = ?
`).get(e)||null}getObservationsByIds(e,s={}){if(e.length===0)return[];let{orderBy:t="date_desc",limit:r}=s,o=t==="date_asc"?"ASC":"DESC",n=r?`LIMIT ${r}`:"",i=e.map(()=>"?").join(",");return this.db.prepare(`
`).get(e)||null}getObservationsByIds(e,s={}){if(e.length===0)return[];let{orderBy:t="date_desc",limit:r}=s,n=t==="date_asc"?"ASC":"DESC",o=r?`LIMIT ${r}`:"",i=e.map(()=>"?").join(",");return this.db.prepare(`
SELECT *
FROM observations
WHERE id IN (${i})
ORDER BY created_at_epoch ${o}
${n}
ORDER BY created_at_epoch ${n}
${o}
`).all(...e)}getSummaryForSession(e){return this.db.prepare(`
SELECT
request, investigated, learned, completed, next_steps,
@@ -232,7 +232,7 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
SELECT files_read, files_modified
FROM observations
WHERE sdk_session_id = ?
`).all(e),r=new Set,o=new Set;for(let n of t){if(n.files_read)try{let i=JSON.parse(n.files_read);Array.isArray(i)&&i.forEach(p=>r.add(p))}catch{}if(n.files_modified)try{let i=JSON.parse(n.files_modified);Array.isArray(i)&&i.forEach(p=>o.add(p))}catch{}}return{filesRead:Array.from(r),filesModified:Array.from(o)}}getSessionById(e){return this.db.prepare(`
`).all(e),r=new Set,n=new Set;for(let o of t){if(o.files_read)try{let i=JSON.parse(o.files_read);Array.isArray(i)&&i.forEach(d=>r.add(d))}catch{}if(o.files_modified)try{let i=JSON.parse(o.files_modified);Array.isArray(i)&&i.forEach(d=>n.add(d))}catch{}}return{filesRead:Array.from(r),filesModified:Array.from(n)}}getSessionById(e){return this.db.prepare(`
SELECT id, claude_session_id, sdk_session_id, project, user_prompt
FROM sdk_sessions
WHERE id = ?
@@ -259,17 +259,17 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
SELECT prompt_counter FROM sdk_sessions WHERE id = ?
`).get(e)?.prompt_counter||1}getPromptCounter(e){return this.db.prepare(`
SELECT prompt_counter FROM sdk_sessions WHERE id = ?
`).get(e)?.prompt_counter||0}createSDKSession(e,s,t){let r=new Date,o=r.getTime(),i=this.db.prepare(`
`).get(e)?.prompt_counter||0}createSDKSession(e,s,t){let r=new Date,n=r.getTime(),i=this.db.prepare(`
INSERT OR IGNORE INTO sdk_sessions
(claude_session_id, sdk_session_id, project, user_prompt, started_at, started_at_epoch, status)
VALUES (?, ?, ?, ?, ?, ?, 'active')
`).run(e,e,s,t,r.toISOString(),o);return i.lastInsertRowid===0||i.changes===0?this.db.prepare(`
`).run(e,e,s,t,r.toISOString(),n);return i.lastInsertRowid===0||i.changes===0?this.db.prepare(`
SELECT id FROM sdk_sessions WHERE claude_session_id = ? LIMIT 1
`).get(e).id:i.lastInsertRowid}updateSDKSessionId(e,s){return this.db.prepare(`
UPDATE sdk_sessions
SET sdk_session_id = ?
WHERE id = ? AND sdk_session_id IS NULL
`).run(s,e).changes===0?(y.debug("DB","sdk_session_id already set, skipping update",{sessionId:e,sdkSessionId:s}),!1):!0}setWorkerPort(e,s){this.db.prepare(`
`).run(s,e).changes===0?(A.debug("DB","sdk_session_id already set, skipping update",{sessionId:e,sdkSessionId:s}),!1):!0}setWorkerPort(e,s){this.db.prepare(`
UPDATE sdk_sessions
SET worker_port = ?
WHERE id = ?
@@ -278,33 +278,33 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
FROM sdk_sessions
WHERE id = ?
LIMIT 1
`).get(e)?.worker_port||null}saveUserPrompt(e,s,t){let r=new Date,o=r.getTime();return this.db.prepare(`
`).get(e)?.worker_port||null}saveUserPrompt(e,s,t){let r=new Date,n=r.getTime();return this.db.prepare(`
INSERT INTO user_prompts
(claude_session_id, prompt_number, prompt_text, created_at, created_at_epoch)
VALUES (?, ?, ?, ?, ?)
`).run(e,s,t,r.toISOString(),o).lastInsertRowid}storeObservation(e,s,t,r){let o=new Date,n=o.getTime();this.db.prepare(`
`).run(e,s,t,r.toISOString(),n).lastInsertRowid}storeObservation(e,s,t,r){let n=new Date,o=n.getTime();this.db.prepare(`
SELECT id FROM sdk_sessions WHERE sdk_session_id = ?
`).get(e)||(this.db.prepare(`
INSERT INTO sdk_sessions
(claude_session_id, sdk_session_id, project, started_at, started_at_epoch, status)
VALUES (?, ?, ?, ?, ?, 'active')
`).run(e,e,s,o.toISOString(),n),console.error(`[SessionStore] Auto-created session record for session_id: ${e}`));let u=this.db.prepare(`
`).run(e,e,s,n.toISOString(),o),console.error(`[SessionStore] Auto-created session record for session_id: ${e}`));let u=this.db.prepare(`
INSERT INTO observations
(sdk_session_id, project, type, title, subtitle, facts, narrative, concepts,
files_read, files_modified, prompt_number, created_at, created_at_epoch)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
`).run(e,s,t.type,t.title,t.subtitle,JSON.stringify(t.facts),t.narrative,JSON.stringify(t.concepts),JSON.stringify(t.files_read),JSON.stringify(t.files_modified),r||null,o.toISOString(),n);return{id:Number(u.lastInsertRowid),createdAtEpoch:n}}storeSummary(e,s,t,r){let o=new Date,n=o.getTime();this.db.prepare(`
`).run(e,s,t.type,t.title,t.subtitle,JSON.stringify(t.facts),t.narrative,JSON.stringify(t.concepts),JSON.stringify(t.files_read),JSON.stringify(t.files_modified),r||null,n.toISOString(),o);return{id:Number(u.lastInsertRowid),createdAtEpoch:o}}storeSummary(e,s,t,r){let n=new Date,o=n.getTime();this.db.prepare(`
SELECT id FROM sdk_sessions WHERE sdk_session_id = ?
`).get(e)||(this.db.prepare(`
INSERT INTO sdk_sessions
(claude_session_id, sdk_session_id, project, started_at, started_at_epoch, status)
VALUES (?, ?, ?, ?, ?, 'active')
`).run(e,e,s,o.toISOString(),n),console.error(`[SessionStore] Auto-created session record for session_id: ${e}`));let u=this.db.prepare(`
`).run(e,e,s,n.toISOString(),o),console.error(`[SessionStore] Auto-created session record for session_id: ${e}`));let u=this.db.prepare(`
INSERT INTO session_summaries
(sdk_session_id, project, request, investigated, learned, completed,
next_steps, notes, prompt_number, created_at, created_at_epoch)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
`).run(e,s,t.request,t.investigated,t.learned,t.completed,t.next_steps,t.notes,r||null,o.toISOString(),n);return{id:Number(u.lastInsertRowid),createdAtEpoch:n}}markSessionCompleted(e){let s=new Date,t=s.getTime();this.db.prepare(`
`).run(e,s,t.request,t.investigated,t.learned,t.completed,t.next_steps,t.notes,r||null,n.toISOString(),o);return{id:Number(u.lastInsertRowid),createdAtEpoch:o}}markSessionCompleted(e){let s=new Date,t=s.getTime();this.db.prepare(`
UPDATE sdk_sessions
SET status = 'completed', completed_at = ?, completed_at_epoch = ?
WHERE id = ?
@@ -316,12 +316,12 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
UPDATE sdk_sessions
SET status = 'failed', completed_at = ?, completed_at_epoch = ?
WHERE status = 'active'
`).run(e.toISOString(),s).changes}getSessionSummariesByIds(e,s={}){if(e.length===0)return[];let{orderBy:t="date_desc",limit:r}=s,o=t==="date_asc"?"ASC":"DESC",n=r?`LIMIT ${r}`:"",i=e.map(()=>"?").join(",");return this.db.prepare(`
`).run(e.toISOString(),s).changes}getSessionSummariesByIds(e,s={}){if(e.length===0)return[];let{orderBy:t="date_desc",limit:r}=s,n=t==="date_asc"?"ASC":"DESC",o=r?`LIMIT ${r}`:"",i=e.map(()=>"?").join(",");return this.db.prepare(`
SELECT * FROM session_summaries
WHERE id IN (${i})
ORDER BY created_at_epoch ${o}
${n}
`).all(...e)}getUserPromptsByIds(e,s={}){if(e.length===0)return[];let{orderBy:t="date_desc",limit:r}=s,o=t==="date_asc"?"ASC":"DESC",n=r?`LIMIT ${r}`:"",i=e.map(()=>"?").join(",");return this.db.prepare(`
ORDER BY created_at_epoch ${n}
${o}
`).all(...e)}getUserPromptsByIds(e,s={}){if(e.length===0)return[];let{orderBy:t="date_desc",limit:r}=s,n=t==="date_asc"?"ASC":"DESC",o=r?`LIMIT ${r}`:"",i=e.map(()=>"?").join(",");return this.db.prepare(`
SELECT
up.*,
s.project,
@@ -329,46 +329,46 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
FROM user_prompts up
JOIN sdk_sessions s ON up.claude_session_id = s.claude_session_id
WHERE up.id IN (${i})
ORDER BY up.created_at_epoch ${o}
${n}
`).all(...e)}getTimelineAroundTimestamp(e,s=10,t=10,r){return this.getTimelineAroundObservation(null,e,s,t,r)}getTimelineAroundObservation(e,s,t=10,r=10,o){let n=o?"AND project = ?":"",i=o?[o]:[],p,_;if(e!==null){let T=`
ORDER BY up.created_at_epoch ${n}
${o}
`).all(...e)}getTimelineAroundTimestamp(e,s=10,t=10,r){return this.getTimelineAroundObservation(null,e,s,t,r)}getTimelineAroundObservation(e,s,t=10,r=10,n){let o=n?"AND project = ?":"",i=n?[n]:[],d,p;if(e!==null){let l=`
SELECT id, created_at_epoch
FROM observations
WHERE id <= ? ${n}
WHERE id <= ? ${o}
ORDER BY id DESC
LIMIT ?
`,S=`
SELECT id, created_at_epoch
FROM observations
WHERE id >= ? ${n}
WHERE id >= ? ${o}
ORDER BY id ASC
LIMIT ?
`;try{let c=this.db.prepare(T).all(e,...i,t+1),a=this.db.prepare(S).all(e,...i,r+1);if(c.length===0&&a.length===0)return{observations:[],sessions:[],prompts:[]};p=c.length>0?c[c.length-1].created_at_epoch:s,_=a.length>0?a[a.length-1].created_at_epoch:s}catch(c){return console.error("[SessionStore] Error getting boundary observations:",c.message),{observations:[],sessions:[],prompts:[]}}}else{let T=`
`;try{let _=this.db.prepare(l).all(e,...i,t+1),a=this.db.prepare(S).all(e,...i,r+1);if(_.length===0&&a.length===0)return{observations:[],sessions:[],prompts:[]};d=_.length>0?_[_.length-1].created_at_epoch:s,p=a.length>0?a[a.length-1].created_at_epoch:s}catch(_){return console.error("[SessionStore] Error getting boundary observations:",_.message),{observations:[],sessions:[],prompts:[]}}}else{let l=`
SELECT created_at_epoch
FROM observations
WHERE created_at_epoch <= ? ${n}
WHERE created_at_epoch <= ? ${o}
ORDER BY created_at_epoch DESC
LIMIT ?
`,S=`
SELECT created_at_epoch
FROM observations
WHERE created_at_epoch >= ? ${n}
WHERE created_at_epoch >= ? ${o}
ORDER BY created_at_epoch ASC
LIMIT ?
`;try{let c=this.db.prepare(T).all(s,...i,t),a=this.db.prepare(S).all(s,...i,r+1);if(c.length===0&&a.length===0)return{observations:[],sessions:[],prompts:[]};p=c.length>0?c[c.length-1].created_at_epoch:s,_=a.length>0?a[a.length-1].created_at_epoch:s}catch(c){return console.error("[SessionStore] Error getting boundary timestamps:",c.message),{observations:[],sessions:[],prompts:[]}}}let u=`
`;try{let _=this.db.prepare(l).all(s,...i,t),a=this.db.prepare(S).all(s,...i,r+1);if(_.length===0&&a.length===0)return{observations:[],sessions:[],prompts:[]};d=_.length>0?_[_.length-1].created_at_epoch:s,p=a.length>0?a[a.length-1].created_at_epoch:s}catch(_){return console.error("[SessionStore] Error getting boundary timestamps:",_.message),{observations:[],sessions:[],prompts:[]}}}let u=`
SELECT *
FROM observations
WHERE created_at_epoch >= ? AND created_at_epoch <= ? ${n}
WHERE created_at_epoch >= ? AND created_at_epoch <= ? ${o}
ORDER BY created_at_epoch ASC
`,l=`
`,T=`
SELECT *
FROM session_summaries
WHERE created_at_epoch >= ? AND created_at_epoch <= ? ${n}
WHERE created_at_epoch >= ? AND created_at_epoch <= ? ${o}
ORDER BY created_at_epoch ASC
`,b=`
SELECT up.*, s.project, s.sdk_session_id
FROM user_prompts up
JOIN sdk_sessions s ON up.claude_session_id = s.claude_session_id
WHERE up.created_at_epoch >= ? AND up.created_at_epoch <= ? ${n.replace("project","s.project")}
WHERE up.created_at_epoch >= ? AND up.created_at_epoch <= ? ${o.replace("project","s.project")}
ORDER BY up.created_at_epoch ASC
`;try{let T=this.db.prepare(u).all(p,_,...i),S=this.db.prepare(l).all(p,_,...i),c=this.db.prepare(b).all(p,_,...i);return{observations:T,sessions:S.map(a=>({id:a.id,sdk_session_id:a.sdk_session_id,project:a.project,request:a.request,completed:a.completed,next_steps:a.next_steps,created_at:a.created_at,created_at_epoch:a.created_at_epoch})),prompts:c.map(a=>({id:a.id,claude_session_id:a.claude_session_id,project:a.project,prompt:a.prompt_text,created_at:a.created_at,created_at_epoch:a.created_at_epoch}))}}catch(T){return console.error("[SessionStore] Error querying timeline records:",T.message),{observations:[],sessions:[],prompts:[]}}}close(){this.db.close()}};function j(d,e,s){return d==="PreCompact"?e?{continue:!0,suppressOutput:!0}:{continue:!1,stopReason:s.reason||"Pre-compact operation failed",suppressOutput:!0}:d==="SessionStart"?e&&s.context?{continue:!0,suppressOutput:!0,hookSpecificOutput:{hookEventName:"SessionStart",additionalContext:s.context}}:{continue:!0,suppressOutput:!0}:d==="UserPromptSubmit"||d==="PostToolUse"?{continue:!0,suppressOutput:!0}:d==="Stop"?{continue:!0,suppressOutput:!0}:{continue:e,suppressOutput:!0,...s.reason&&!e?{stopReason:s.reason}:{}}}function D(d,e,s={}){let t=j(d,e,s);return JSON.stringify(t)}import f from"path";import{existsSync as O}from"fs";import{spawn as Y}from"child_process";var x=parseInt(process.env.CLAUDE_MEM_WORKER_PORT||"37777",10),K=`http://127.0.0.1:${x}/health`;async function k(){try{return(await fetch(K,{signal:AbortSignal.timeout(500)})).ok}catch{return!1}}async function U(){try{if(await k())return!0;console.error("[claude-mem] Worker not responding, starting...");let d=v(),e=f.join(d,"plugin","scripts","worker-service.cjs");if(!O(e))return console.error(`[claude-mem] Worker service not found at ${e}`),!1;let s=f.join(d,"ecosystem.config.cjs"),t=f.join(d,"node_modules",".bin","pm2");if(!O(t))throw new Error(`PM2 binary not found at ${t}. 
This is a bundled dependency - try running: npm install`);if(!O(s))throw new Error(`PM2 ecosystem config not found at ${s}. Plugin installation may be corrupted.`);let r=Y(t,["start",s],{detached:!0,stdio:"ignore",cwd:d});r.on("error",o=>{throw new Error(`Failed to spawn PM2: ${o.message}`)}),r.unref(),console.error("[claude-mem] Worker started with PM2");for(let o=0;o<3;o++)if(await new Promise(n=>setTimeout(n,500)),await k())return console.error("[claude-mem] Worker is healthy"),!0;return console.error("[claude-mem] Worker failed to become healthy after startup"),!1}catch(d){return console.error(`[claude-mem] Failed to start worker: ${d.message}`),!1}}function w(){return x}async function q(d){if(!d)throw new Error("newHook requires input");let{session_id:e,cwd:s,prompt:t}=d,r=V.basename(s);if(!await U())throw new Error("Worker service failed to start or become healthy");let n=new R,i=n.createSDKSession(e,r,t),p=n.incrementPromptCounter(i);n.saveUserPrompt(e,p,t),console.error(`[new-hook] Session ${i}, prompt #${p}`),n.close();let _=w(),u=await fetch(`http://127.0.0.1:${_}/sessions/${i}/init`,{method:"POST",headers:{"Content-Type":"application/json"},body:JSON.stringify({project:r,userPrompt:t}),signal:AbortSignal.timeout(5e3)});if(!u.ok){let l=await u.text();throw new Error(`Failed to initialize session: ${u.status} ${l}`)}console.log(D("UserPromptSubmit",!0))}var I="";M.on("data",d=>I+=d);M.on("end",async()=>{let d=I?JSON.parse(I):void 0;await q(d)});
`;try{let l=this.db.prepare(u).all(d,p,...i),S=this.db.prepare(T).all(d,p,...i),_=this.db.prepare(b).all(d,p,...i);return{observations:l,sessions:S.map(a=>({id:a.id,sdk_session_id:a.sdk_session_id,project:a.project,request:a.request,completed:a.completed,next_steps:a.next_steps,created_at:a.created_at,created_at_epoch:a.created_at_epoch})),prompts:_.map(a=>({id:a.id,claude_session_id:a.claude_session_id,project:a.project,prompt:a.prompt_text,created_at:a.created_at,created_at_epoch:a.created_at_epoch}))}}catch(l){return console.error("[SessionStore] Error querying timeline records:",l.message),{observations:[],sessions:[],prompts:[]}}}close(){this.db.close()}};function X(c,e,s){return c==="PreCompact"?e?{continue:!0,suppressOutput:!0}:{continue:!1,stopReason:s.reason||"Pre-compact operation failed",suppressOutput:!0}:c==="SessionStart"?e&&s.context?{continue:!0,suppressOutput:!0,hookSpecificOutput:{hookEventName:"SessionStart",additionalContext:s.context}}:{continue:!0,suppressOutput:!0}:c==="UserPromptSubmit"||c==="PostToolUse"?{continue:!0,suppressOutput:!0}:c==="Stop"?{continue:!0,suppressOutput:!0}:{continue:e,suppressOutput:!0,...s.reason&&!e?{stopReason:s.reason}:{}}}function C(c,e,s={}){let t=X(c,e,s);return JSON.stringify(t)}async function B(c){if(!c)throw new Error("newHook requires input");let{session_id:e,cwd:s,prompt:t}=c,r=F.basename(s),n=new R,o=n.createSDKSession(e,r,t),i=n.incrementPromptCounter(o);n.saveUserPrompt(e,i,t),console.error(`[new-hook] Session ${o}, prompt #${i}`),n.close();let d=parseInt(process.env.CLAUDE_MEM_WORKER_PORT||"37777",10);try{let p=await fetch(`http://127.0.0.1:${d}/sessions/${o}/init`,{method:"POST",headers:{"Content-Type":"application/json"},body:JSON.stringify({project:r,userPrompt:t}),signal:AbortSignal.timeout(5e3)});if(!p.ok){let u=await p.text();throw new Error(`Failed to initialize session: ${p.status} ${u}`)}}catch(p){throw p.cause?.code==="ECONNREFUSED"||p.name==="TimeoutError"||p.message.includes("fetch 
failed")?new Error("There's a problem with the worker. If you just updated, type `pm2 restart claude-mem-worker` in your terminal to continue"):p}console.log(C("UserPromptSubmit",!0))}var O="";v.on("data",c=>O+=c);v.on("end",async()=>{let c=O?JSON.parse(O):void 0;await B(c)});
+17
-17
@@ -1,7 +1,7 @@
#!/usr/bin/env node
import{stdin as U}from"process";import $ from"better-sqlite3";import{join as u,dirname as X,basename as Q}from"path";import{homedir as C}from"os";import{existsSync as se,mkdirSync as F}from"fs";import{fileURLToPath as P}from"url";function H(){return typeof __dirname<"u"?__dirname:X(P(import.meta.url))}var B=H(),l=process.env.CLAUDE_MEM_DATA_DIR||u(C(),".claude-mem"),h=process.env.CLAUDE_CONFIG_DIR||u(C(),".claude"),re=u(l,"archives"),oe=u(l,"logs"),ne=u(l,"trash"),ie=u(l,"backups"),ae=u(l,"settings.json"),v=u(l,"claude-mem.db"),de=u(l,"vector-db"),pe=u(h,"settings.json"),ce=u(h,"commands"),_e=u(h,"CLAUDE.md");function y(d){F(d,{recursive:!0})}function D(){return u(B,"..","..")}var N=(o=>(o[o.DEBUG=0]="DEBUG",o[o.INFO=1]="INFO",o[o.WARN=2]="WARN",o[o.ERROR=3]="ERROR",o[o.SILENT=4]="SILENT",o))(N||{}),f=class{level;useColor;constructor(){let e=process.env.CLAUDE_MEM_LOG_LEVEL?.toUpperCase()||"INFO";this.level=N[e]??1,this.useColor=process.stdout.isTTY??!1}correlationId(e,s){return`obs-${e}-${s}`}sessionId(e){return`session-${e}`}formatData(e){if(e==null)return"";if(typeof e=="string")return e;if(typeof e=="number"||typeof e=="boolean")return e.toString();if(typeof e=="object"){if(e instanceof Error)return this.level===0?`${e.message}
${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Object.keys(e);return s.length===0?"{}":s.length<=3?JSON.stringify(e):`{${s.length} keys: ${s.slice(0,3).join(", ")}...}`}return String(e)}formatTool(e,s){if(!s)return e;try{let t=typeof s=="string"?JSON.parse(s):s;if(e==="Bash"&&t.command){let r=t.command.length>50?t.command.substring(0,50)+"...":t.command;return`${e}(${r})`}if(e==="Read"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Edit"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Write"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}return e}catch{return e}}log(e,s,t,r,o){if(e<this.level)return;let n=new Date().toISOString().replace("T"," ").substring(0,23),i=N[e].padEnd(5),p=s.padEnd(6),_="";r?.correlationId?_=`[${r.correlationId}] `:r?.sessionId&&(_=`[session-${r.sessionId}] `);let E="";o!=null&&(this.level===0&&typeof o=="object"?E=`
`+JSON.stringify(o,null,2):E=" "+this.formatData(o));let m="";if(r){let{sessionId:T,sdkSessionId:R,correlationId:c,...a}=r;Object.keys(a).length>0&&(m=` {${Object.entries(a).map(([w,M])=>`${w}=${M}`).join(", ")}}`)}let S=`[${n}] [${i}] [${p}] ${_}${t}${m}${E}`;e===3?console.error(S):console.log(S)}debug(e,s,t,r){this.log(0,e,s,t,r)}info(e,s,t,r){this.log(1,e,s,t,r)}warn(e,s,t,r){this.log(2,e,s,t,r)}error(e,s,t,r){this.log(3,e,s,t,r)}dataIn(e,s,t,r){this.info(e,`\u2192 ${s}`,t,r)}dataOut(e,s,t,r){this.info(e,`\u2190 ${s}`,t,r)}success(e,s,t,r){this.info(e,`\u2713 ${s}`,t,r)}failure(e,s,t,r){this.error(e,`\u2717 ${s}`,t,r)}timing(e,s,t,r){this.info(e,`\u23F1 ${s}`,r,{duration:`${t}ms`})}},b=new f;var g=class{db;constructor(){y(l),this.db=new $(v),this.db.pragma("journal_mode = WAL"),this.db.pragma("synchronous = NORMAL"),this.db.pragma("foreign_keys = ON"),this.initializeSchema(),this.ensureWorkerPortColumn(),this.ensurePromptTrackingColumns(),this.removeSessionSummariesUniqueConstraint(),this.addObservationHierarchicalFields(),this.makeObservationsTextNullable(),this.createUserPromptsTable()}initializeSchema(){try{this.db.exec(`
import{stdin as v}from"process";import w from"better-sqlite3";import{join as u,dirname as k,basename as $}from"path";import{homedir as L}from"os";import{existsSync as Y,mkdirSync as x}from"fs";import{fileURLToPath as U}from"url";function M(){return typeof __dirname<"u"?__dirname:k(U(import.meta.url))}var V=M(),l=process.env.CLAUDE_MEM_DATA_DIR||u(L(),".claude-mem"),N=process.env.CLAUDE_CONFIG_DIR||u(L(),".claude"),q=u(l,"archives"),J=u(l,"logs"),Q=u(l,"trash"),z=u(l,"backups"),Z=u(l,"settings.json"),A=u(l,"claude-mem.db"),ee=u(l,"vector-db"),se=u(N,"settings.json"),te=u(N,"commands"),re=u(N,"CLAUDE.md");function C(c){x(c,{recursive:!0})}var h=(o=>(o[o.DEBUG=0]="DEBUG",o[o.INFO=1]="INFO",o[o.WARN=2]="WARN",o[o.ERROR=3]="ERROR",o[o.SILENT=4]="SILENT",o))(h||{}),O=class{level;useColor;constructor(){let e=process.env.CLAUDE_MEM_LOG_LEVEL?.toUpperCase()||"INFO";this.level=h[e]??1,this.useColor=process.stdout.isTTY??!1}correlationId(e,s){return`obs-${e}-${s}`}sessionId(e){return`session-${e}`}formatData(e){if(e==null)return"";if(typeof e=="string")return e;if(typeof e=="number"||typeof e=="boolean")return e.toString();if(typeof e=="object"){if(e instanceof Error)return this.level===0?`${e.message}
${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Object.keys(e);return s.length===0?"{}":s.length<=3?JSON.stringify(e):`{${s.length} keys: ${s.slice(0,3).join(", ")}...}`}return String(e)}formatTool(e,s){if(!s)return e;try{let t=typeof s=="string"?JSON.parse(s):s;if(e==="Bash"&&t.command){let r=t.command.length>50?t.command.substring(0,50)+"...":t.command;return`${e}(${r})`}if(e==="Read"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Edit"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Write"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}return e}catch{return e}}log(e,s,t,r,o){if(e<this.level)return;let n=new Date().toISOString().replace("T"," ").substring(0,23),i=h[e].padEnd(5),d=s.padEnd(6),E="";r?.correlationId?E=`[${r.correlationId}] `:r?.sessionId&&(E=`[session-${r.sessionId}] `);let p="";o!=null&&(this.level===0&&typeof o=="object"?p=`
`+JSON.stringify(o,null,2):p=" "+this.formatData(o));let m="";if(r){let{sessionId:T,sdkSessionId:b,correlationId:_,...a}=r;Object.keys(a).length>0&&(m=` {${Object.entries(a).map(([y,D])=>`${y}=${D}`).join(", ")}}`)}let R=`[${n}] [${i}] [${d}] ${E}${t}${m}${p}`;e===3?console.error(R):console.log(R)}debug(e,s,t,r){this.log(0,e,s,t,r)}info(e,s,t,r){this.log(1,e,s,t,r)}warn(e,s,t,r){this.log(2,e,s,t,r)}error(e,s,t,r){this.log(3,e,s,t,r)}dataIn(e,s,t,r){this.info(e,`\u2192 ${s}`,t,r)}dataOut(e,s,t,r){this.info(e,`\u2190 ${s}`,t,r)}success(e,s,t,r){this.info(e,`\u2713 ${s}`,t,r)}failure(e,s,t,r){this.error(e,`\u2717 ${s}`,t,r)}timing(e,s,t,r){this.info(e,`\u23F1 ${s}`,r,{duration:`${t}ms`})}},S=new O;var g=class{db;constructor(){C(l),this.db=new w(A),this.db.pragma("journal_mode = WAL"),this.db.pragma("synchronous = NORMAL"),this.db.pragma("foreign_keys = ON"),this.initializeSchema(),this.ensureWorkerPortColumn(),this.ensurePromptTrackingColumns(),this.removeSessionSummariesUniqueConstraint(),this.addObservationHierarchicalFields(),this.makeObservationsTextNullable(),this.createUserPromptsTable()}initializeSchema(){try{this.db.exec(`
CREATE TABLE IF NOT EXISTS schema_versions (
id INTEGER PRIMARY KEY,
version INTEGER UNIQUE NOT NULL,
@@ -63,7 +63,7 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
CREATE INDEX IF NOT EXISTS idx_session_summaries_sdk_session ON session_summaries(sdk_session_id);
CREATE INDEX IF NOT EXISTS idx_session_summaries_project ON session_summaries(project);
CREATE INDEX IF NOT EXISTS idx_session_summaries_created ON session_summaries(created_at_epoch DESC);
`),this.db.prepare("INSERT INTO schema_versions (version, applied_at) VALUES (?, ?)").run(4,new Date().toISOString()),console.error("[SessionStore] Migration004 applied successfully"))}catch(e){throw console.error("[SessionStore] Schema initialization error:",e.message),e}}ensureWorkerPortColumn(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(5))return;this.db.pragma("table_info(sdk_sessions)").some(r=>r.name==="worker_port")||(this.db.exec("ALTER TABLE sdk_sessions ADD COLUMN worker_port INTEGER"),console.error("[SessionStore] Added worker_port column to sdk_sessions table")),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(5,new Date().toISOString())}catch(e){console.error("[SessionStore] Migration error:",e.message)}}ensurePromptTrackingColumns(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(6))return;this.db.pragma("table_info(sdk_sessions)").some(p=>p.name==="prompt_counter")||(this.db.exec("ALTER TABLE sdk_sessions ADD COLUMN prompt_counter INTEGER DEFAULT 0"),console.error("[SessionStore] Added prompt_counter column to sdk_sessions table")),this.db.pragma("table_info(observations)").some(p=>p.name==="prompt_number")||(this.db.exec("ALTER TABLE observations ADD COLUMN prompt_number INTEGER"),console.error("[SessionStore] Added prompt_number column to observations table")),this.db.pragma("table_info(session_summaries)").some(p=>p.name==="prompt_number")||(this.db.exec("ALTER TABLE session_summaries ADD COLUMN prompt_number INTEGER"),console.error("[SessionStore] Added prompt_number column to session_summaries table")),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(6,new Date().toISOString())}catch(e){console.error("[SessionStore] Prompt tracking migration error:",e.message)}}removeSessionSummariesUniqueConstraint(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = 
?").get(7))return;if(!this.db.pragma("index_list(session_summaries)").some(r=>r.unique===1)){this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(7,new Date().toISOString());return}console.error("[SessionStore] Removing UNIQUE constraint from session_summaries.sdk_session_id..."),this.db.exec("BEGIN TRANSACTION");try{this.db.exec(`
`),this.db.prepare("INSERT INTO schema_versions (version, applied_at) VALUES (?, ?)").run(4,new Date().toISOString()),console.error("[SessionStore] Migration004 applied successfully"))}catch(e){throw console.error("[SessionStore] Schema initialization error:",e.message),e}}ensureWorkerPortColumn(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(5))return;this.db.pragma("table_info(sdk_sessions)").some(r=>r.name==="worker_port")||(this.db.exec("ALTER TABLE sdk_sessions ADD COLUMN worker_port INTEGER"),console.error("[SessionStore] Added worker_port column to sdk_sessions table")),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(5,new Date().toISOString())}catch(e){console.error("[SessionStore] Migration error:",e.message)}}ensurePromptTrackingColumns(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(6))return;this.db.pragma("table_info(sdk_sessions)").some(d=>d.name==="prompt_counter")||(this.db.exec("ALTER TABLE sdk_sessions ADD COLUMN prompt_counter INTEGER DEFAULT 0"),console.error("[SessionStore] Added prompt_counter column to sdk_sessions table")),this.db.pragma("table_info(observations)").some(d=>d.name==="prompt_number")||(this.db.exec("ALTER TABLE observations ADD COLUMN prompt_number INTEGER"),console.error("[SessionStore] Added prompt_number column to observations table")),this.db.pragma("table_info(session_summaries)").some(d=>d.name==="prompt_number")||(this.db.exec("ALTER TABLE session_summaries ADD COLUMN prompt_number INTEGER"),console.error("[SessionStore] Added prompt_number column to session_summaries table")),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(6,new Date().toISOString())}catch(e){console.error("[SessionStore] Prompt tracking migration error:",e.message)}}removeSessionSummariesUniqueConstraint(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = 
?").get(7))return;if(!this.db.pragma("index_list(session_summaries)").some(r=>r.unique===1)){this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(7,new Date().toISOString());return}console.error("[SessionStore] Removing UNIQUE constraint from session_summaries.sdk_session_id..."),this.db.exec("BEGIN TRANSACTION");try{this.db.exec(`
CREATE TABLE session_summaries_new (
id INTEGER PRIMARY KEY AUTOINCREMENT,
sdk_session_id TEXT NOT NULL,
@@ -232,7 +232,7 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
SELECT files_read, files_modified
FROM observations
WHERE sdk_session_id = ?
`).all(e),r=new Set,o=new Set;for(let n of t){if(n.files_read)try{let i=JSON.parse(n.files_read);Array.isArray(i)&&i.forEach(p=>r.add(p))}catch{}if(n.files_modified)try{let i=JSON.parse(n.files_modified);Array.isArray(i)&&i.forEach(p=>o.add(p))}catch{}}return{filesRead:Array.from(r),filesModified:Array.from(o)}}getSessionById(e){return this.db.prepare(`
`).all(e),r=new Set,o=new Set;for(let n of t){if(n.files_read)try{let i=JSON.parse(n.files_read);Array.isArray(i)&&i.forEach(d=>r.add(d))}catch{}if(n.files_modified)try{let i=JSON.parse(n.files_modified);Array.isArray(i)&&i.forEach(d=>o.add(d))}catch{}}return{filesRead:Array.from(r),filesModified:Array.from(o)}}getSessionById(e){return this.db.prepare(`
SELECT id, claude_session_id, sdk_session_id, project, user_prompt
FROM sdk_sessions
WHERE id = ?
@@ -269,7 +269,7 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
UPDATE sdk_sessions
SET sdk_session_id = ?
WHERE id = ? AND sdk_session_id IS NULL
`).run(s,e).changes===0?(b.debug("DB","sdk_session_id already set, skipping update",{sessionId:e,sdkSessionId:s}),!1):!0}setWorkerPort(e,s){this.db.prepare(`
`).run(s,e).changes===0?(S.debug("DB","sdk_session_id already set, skipping update",{sessionId:e,sdkSessionId:s}),!1):!0}setWorkerPort(e,s){this.db.prepare(`
UPDATE sdk_sessions
SET worker_port = ?
WHERE id = ?
@@ -288,23 +288,23 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
INSERT INTO sdk_sessions
(claude_session_id, sdk_session_id, project, started_at, started_at_epoch, status)
VALUES (?, ?, ?, ?, ?, 'active')
`).run(e,e,s,o.toISOString(),n),console.error(`[SessionStore] Auto-created session record for session_id: ${e}`));let E=this.db.prepare(`
`).run(e,e,s,o.toISOString(),n),console.error(`[SessionStore] Auto-created session record for session_id: ${e}`));let p=this.db.prepare(`
INSERT INTO observations
(sdk_session_id, project, type, title, subtitle, facts, narrative, concepts,
files_read, files_modified, prompt_number, created_at, created_at_epoch)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
`).run(e,s,t.type,t.title,t.subtitle,JSON.stringify(t.facts),t.narrative,JSON.stringify(t.concepts),JSON.stringify(t.files_read),JSON.stringify(t.files_modified),r||null,o.toISOString(),n);return{id:Number(E.lastInsertRowid),createdAtEpoch:n}}storeSummary(e,s,t,r){let o=new Date,n=o.getTime();this.db.prepare(`
`).run(e,s,t.type,t.title,t.subtitle,JSON.stringify(t.facts),t.narrative,JSON.stringify(t.concepts),JSON.stringify(t.files_read),JSON.stringify(t.files_modified),r||null,o.toISOString(),n);return{id:Number(p.lastInsertRowid),createdAtEpoch:n}}storeSummary(e,s,t,r){let o=new Date,n=o.getTime();this.db.prepare(`
SELECT id FROM sdk_sessions WHERE sdk_session_id = ?
`).get(e)||(this.db.prepare(`
INSERT INTO sdk_sessions
(claude_session_id, sdk_session_id, project, started_at, started_at_epoch, status)
VALUES (?, ?, ?, ?, ?, 'active')
`).run(e,e,s,o.toISOString(),n),console.error(`[SessionStore] Auto-created session record for session_id: ${e}`));let E=this.db.prepare(`
`).run(e,e,s,o.toISOString(),n),console.error(`[SessionStore] Auto-created session record for session_id: ${e}`));let p=this.db.prepare(`
INSERT INTO session_summaries
(sdk_session_id, project, request, investigated, learned, completed,
next_steps, notes, prompt_number, created_at, created_at_epoch)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
`).run(e,s,t.request,t.investigated,t.learned,t.completed,t.next_steps,t.notes,r||null,o.toISOString(),n);return{id:Number(E.lastInsertRowid),createdAtEpoch:n}}markSessionCompleted(e){let s=new Date,t=s.getTime();this.db.prepare(`
`).run(e,s,t.request,t.investigated,t.learned,t.completed,t.next_steps,t.notes,r||null,o.toISOString(),n);return{id:Number(p.lastInsertRowid),createdAtEpoch:n}}markSessionCompleted(e){let s=new Date,t=s.getTime();this.db.prepare(`
UPDATE sdk_sessions
SET status = 'completed', completed_at = ?, completed_at_epoch = ?
WHERE id = ?
@@ -331,31 +331,31 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
WHERE up.id IN (${i})
ORDER BY up.created_at_epoch ${o}
${n}
`).all(...e)}getTimelineAroundTimestamp(e,s=10,t=10,r){return this.getTimelineAroundObservation(null,e,s,t,r)}getTimelineAroundObservation(e,s,t=10,r=10,o){let n=o?"AND project = ?":"",i=o?[o]:[],p,_;if(e!==null){let T=`
`).all(...e)}getTimelineAroundTimestamp(e,s=10,t=10,r){return this.getTimelineAroundObservation(null,e,s,t,r)}getTimelineAroundObservation(e,s,t=10,r=10,o){let n=o?"AND project = ?":"",i=o?[o]:[],d,E;if(e!==null){let T=`
SELECT id, created_at_epoch
FROM observations
WHERE id <= ? ${n}
ORDER BY id DESC
LIMIT ?
`,R=`
`,b=`
SELECT id, created_at_epoch
FROM observations
WHERE id >= ? ${n}
ORDER BY id ASC
LIMIT ?
`;try{let c=this.db.prepare(T).all(e,...i,t+1),a=this.db.prepare(R).all(e,...i,r+1);if(c.length===0&&a.length===0)return{observations:[],sessions:[],prompts:[]};p=c.length>0?c[c.length-1].created_at_epoch:s,_=a.length>0?a[a.length-1].created_at_epoch:s}catch(c){return console.error("[SessionStore] Error getting boundary observations:",c.message),{observations:[],sessions:[],prompts:[]}}}else{let T=`
`;try{let _=this.db.prepare(T).all(e,...i,t+1),a=this.db.prepare(b).all(e,...i,r+1);if(_.length===0&&a.length===0)return{observations:[],sessions:[],prompts:[]};d=_.length>0?_[_.length-1].created_at_epoch:s,E=a.length>0?a[a.length-1].created_at_epoch:s}catch(_){return console.error("[SessionStore] Error getting boundary observations:",_.message),{observations:[],sessions:[],prompts:[]}}}else{let T=`
SELECT created_at_epoch
FROM observations
WHERE created_at_epoch <= ? ${n}
ORDER BY created_at_epoch DESC
LIMIT ?
`,R=`
`,b=`
SELECT created_at_epoch
FROM observations
WHERE created_at_epoch >= ? ${n}
ORDER BY created_at_epoch ASC
LIMIT ?
`;try{let c=this.db.prepare(T).all(s,...i,t),a=this.db.prepare(R).all(s,...i,r+1);if(c.length===0&&a.length===0)return{observations:[],sessions:[],prompts:[]};p=c.length>0?c[c.length-1].created_at_epoch:s,_=a.length>0?a[a.length-1].created_at_epoch:s}catch(c){return console.error("[SessionStore] Error getting boundary timestamps:",c.message),{observations:[],sessions:[],prompts:[]}}}let E=`
`;try{let _=this.db.prepare(T).all(s,...i,t),a=this.db.prepare(b).all(s,...i,r+1);if(_.length===0&&a.length===0)return{observations:[],sessions:[],prompts:[]};d=_.length>0?_[_.length-1].created_at_epoch:s,E=a.length>0?a[a.length-1].created_at_epoch:s}catch(_){return console.error("[SessionStore] Error getting boundary timestamps:",_.message),{observations:[],sessions:[],prompts:[]}}}let p=`
SELECT *
FROM observations
WHERE created_at_epoch >= ? AND created_at_epoch <= ? ${n}
@@ -365,10 +365,10 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
FROM session_summaries
WHERE created_at_epoch >= ? AND created_at_epoch <= ? ${n}
ORDER BY created_at_epoch ASC
`,S=`
`,R=`
SELECT up.*, s.project, s.sdk_session_id
FROM user_prompts up
JOIN sdk_sessions s ON up.claude_session_id = s.claude_session_id
WHERE up.created_at_epoch >= ? AND up.created_at_epoch <= ? ${n.replace("project","s.project")}
ORDER BY up.created_at_epoch ASC
`;try{let T=this.db.prepare(E).all(p,_,...i),R=this.db.prepare(m).all(p,_,...i),c=this.db.prepare(S).all(p,_,...i);return{observations:T,sessions:R.map(a=>({id:a.id,sdk_session_id:a.sdk_session_id,project:a.project,request:a.request,completed:a.completed,next_steps:a.next_steps,created_at:a.created_at,created_at_epoch:a.created_at_epoch})),prompts:c.map(a=>({id:a.id,claude_session_id:a.claude_session_id,project:a.project,prompt:a.prompt_text,created_at:a.created_at,created_at_epoch:a.created_at_epoch}))}}catch(T){return console.error("[SessionStore] Error querying timeline records:",T.message),{observations:[],sessions:[],prompts:[]}}}close(){this.db.close()}};function W(d,e,s){return d==="PreCompact"?e?{continue:!0,suppressOutput:!0}:{continue:!1,stopReason:s.reason||"Pre-compact operation failed",suppressOutput:!0}:d==="SessionStart"?e&&s.context?{continue:!0,suppressOutput:!0,hookSpecificOutput:{hookEventName:"SessionStart",additionalContext:s.context}}:{continue:!0,suppressOutput:!0}:d==="UserPromptSubmit"||d==="PostToolUse"?{continue:!0,suppressOutput:!0}:d==="Stop"?{continue:!0,suppressOutput:!0}:{continue:e,suppressOutput:!0,...s.reason&&!e?{stopReason:s.reason}:{}}}function O(d,e,s={}){let t=W(d,e,s);return JSON.stringify(t)}import I from"path";import{existsSync as L}from"fs";import{spawn as G}from"child_process";var j=parseInt(process.env.CLAUDE_MEM_WORKER_PORT||"37777",10),Y=`http://127.0.0.1:${j}/health`;async function k(){try{return(await fetch(Y,{signal:AbortSignal.timeout(500)})).ok}catch{return!1}}async function x(){try{if(await k())return!0;console.error("[claude-mem] Worker not responding, starting...");let d=D(),e=I.join(d,"plugin","scripts","worker-service.cjs");if(!L(e))return console.error(`[claude-mem] Worker service not found at ${e}`),!1;let s=I.join(d,"ecosystem.config.cjs"),t=I.join(d,"node_modules",".bin","pm2");if(!L(t))throw new Error(`PM2 binary not found at ${t}. 
This is a bundled dependency - try running: npm install`);if(!L(s))throw new Error(`PM2 ecosystem config not found at ${s}. Plugin installation may be corrupted.`);let r=G(t,["start",s],{detached:!0,stdio:"ignore",cwd:d});r.on("error",o=>{throw new Error(`Failed to spawn PM2: ${o.message}`)}),r.unref(),console.error("[claude-mem] Worker started with PM2");for(let o=0;o<3;o++)if(await new Promise(n=>setTimeout(n,500)),await k())return console.error("[claude-mem] Worker is healthy"),!0;return console.error("[claude-mem] Worker failed to become healthy after startup"),!1}catch(d){return console.error(`[claude-mem] Failed to start worker: ${d.message}`),!1}}var K=new Set(["ListMcpResourcesTool"]);async function V(d){if(!d)throw new Error("saveHook requires input");let{session_id:e,tool_name:s,tool_input:t,tool_output:r}=d;if(K.has(s)){console.log(O("PostToolUse",!0));return}if(!await x())throw new Error("Worker service failed to start or become healthy");let n=new g,i=n.createSDKSession(e,"",""),p=n.getPromptCounter(i);n.close();let _=b.formatTool(s,t),E=parseInt(process.env.CLAUDE_MEM_WORKER_PORT||"37777",10);b.dataIn("HOOK",`PostToolUse: ${_}`,{sessionId:i,workerPort:E});let m=await fetch(`http://127.0.0.1:${E}/sessions/${i}/observations`,{method:"POST",headers:{"Content-Type":"application/json"},body:JSON.stringify({tool_name:s,tool_input:t!==void 0?JSON.stringify(t):"{}",tool_output:r!==void 0?JSON.stringify(r):"{}",prompt_number:p}),signal:AbortSignal.timeout(2e3)});if(!m.ok){let S=await m.text();throw b.failure("HOOK","Failed to send observation",{sessionId:i,status:m.status},S),new Error(`Failed to send observation to worker: ${m.status} ${S}`)}b.debug("HOOK","Observation sent successfully",{sessionId:i,toolName:s}),console.log(O("PostToolUse",!0))}var A="";U.on("data",d=>A+=d);U.on("end",async()=>{let d=A?JSON.parse(A):void 0;await V(d)});
`;try{let T=this.db.prepare(p).all(d,E,...i),b=this.db.prepare(m).all(d,E,...i),_=this.db.prepare(R).all(d,E,...i);return{observations:T,sessions:b.map(a=>({id:a.id,sdk_session_id:a.sdk_session_id,project:a.project,request:a.request,completed:a.completed,next_steps:a.next_steps,created_at:a.created_at,created_at_epoch:a.created_at_epoch})),prompts:_.map(a=>({id:a.id,claude_session_id:a.claude_session_id,project:a.project,prompt:a.prompt_text,created_at:a.created_at,created_at_epoch:a.created_at_epoch}))}}catch(T){return console.error("[SessionStore] Error querying timeline records:",T.message),{observations:[],sessions:[],prompts:[]}}}close(){this.db.close()}};function X(c,e,s){return c==="PreCompact"?e?{continue:!0,suppressOutput:!0}:{continue:!1,stopReason:s.reason||"Pre-compact operation failed",suppressOutput:!0}:c==="SessionStart"?e&&s.context?{continue:!0,suppressOutput:!0,hookSpecificOutput:{hookEventName:"SessionStart",additionalContext:s.context}}:{continue:!0,suppressOutput:!0}:c==="UserPromptSubmit"||c==="PostToolUse"?{continue:!0,suppressOutput:!0}:c==="Stop"?{continue:!0,suppressOutput:!0}:{continue:e,suppressOutput:!0,...s.reason&&!e?{stopReason:s.reason}:{}}}function I(c,e,s={}){let t=X(c,e,s);return JSON.stringify(t)}var F=new Set(["ListMcpResourcesTool"]);async function H(c){if(!c)throw new Error("saveHook requires input");let{session_id:e,tool_name:s,tool_input:t,tool_output:r}=c;if(F.has(s)){console.log(I("PostToolUse",!0));return}let o=new g,n=o.createSDKSession(e,"",""),i=o.getPromptCounter(n);o.close();let d=S.formatTool(s,t),E=parseInt(process.env.CLAUDE_MEM_WORKER_PORT||"37777",10);S.dataIn("HOOK",`PostToolUse: ${d}`,{sessionId:n,workerPort:E});try{let p=await fetch(`http://127.0.0.1:${E}/sessions/${n}/observations`,{method:"POST",headers:{"Content-Type":"application/json"},body:JSON.stringify({tool_name:s,tool_input:t!==void 0?JSON.stringify(t):"{}",tool_output:r!==void 
0?JSON.stringify(r):"{}",prompt_number:i}),signal:AbortSignal.timeout(2e3)});if(!p.ok){let m=await p.text();throw S.failure("HOOK","Failed to send observation",{sessionId:n,status:p.status},m),new Error(`Failed to send observation to worker: ${p.status} ${m}`)}S.debug("HOOK","Observation sent successfully",{sessionId:n,toolName:s})}catch(p){throw p.cause?.code==="ECONNREFUSED"||p.name==="TimeoutError"||p.message.includes("fetch failed")?new Error("There's a problem with the worker. If you just updated, type `pm2 restart claude-mem-worker` in your terminal to continue"):p}console.log(I("PostToolUse",!0))}var f="";v.on("data",c=>f+=c);v.on("end",async()=>{let c=f?JSON.parse(f):void 0;await H(c)});
@@ -1,7 +1,7 @@
#!/usr/bin/env node
import{stdin as U}from"process";import W from"better-sqlite3";import{join as _,dirname as X,basename as J}from"path";import{homedir as A}from"os";import{existsSync as ee,mkdirSync as F}from"fs";import{fileURLToPath as P}from"url";function H(){return typeof __dirname<"u"?__dirname:X(P(import.meta.url))}var B=H(),m=process.env.CLAUDE_MEM_DATA_DIR||_(A(),".claude-mem"),h=process.env.CLAUDE_CONFIG_DIR||_(A(),".claude"),te=_(m,"archives"),re=_(m,"logs"),oe=_(m,"trash"),ne=_(m,"backups"),ie=_(m,"settings.json"),C=_(m,"claude-mem.db"),ae=_(m,"vector-db"),de=_(h,"settings.json"),pe=_(h,"commands"),ce=_(h,"CLAUDE.md");function v(d){F(d,{recursive:!0})}function y(){return _(B,"..","..")}var N=(o=>(o[o.DEBUG=0]="DEBUG",o[o.INFO=1]="INFO",o[o.WARN=2]="WARN",o[o.ERROR=3]="ERROR",o[o.SILENT=4]="SILENT",o))(N||{}),f=class{level;useColor;constructor(){let e=process.env.CLAUDE_MEM_LOG_LEVEL?.toUpperCase()||"INFO";this.level=N[e]??1,this.useColor=process.stdout.isTTY??!1}correlationId(e,s){return`obs-${e}-${s}`}sessionId(e){return`session-${e}`}formatData(e){if(e==null)return"";if(typeof e=="string")return e;if(typeof e=="number"||typeof e=="boolean")return e.toString();if(typeof e=="object"){if(e instanceof Error)return this.level===0?`${e.message}
${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Object.keys(e);return s.length===0?"{}":s.length<=3?JSON.stringify(e):`{${s.length} keys: ${s.slice(0,3).join(", ")}...}`}return String(e)}formatTool(e,s){if(!s)return e;try{let t=typeof s=="string"?JSON.parse(s):s;if(e==="Bash"&&t.command){let r=t.command.length>50?t.command.substring(0,50)+"...":t.command;return`${e}(${r})`}if(e==="Read"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Edit"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Write"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}return e}catch{return e}}log(e,s,t,r,o){if(e<this.level)return;let n=new Date().toISOString().replace("T"," ").substring(0,23),i=N[e].padEnd(5),p=s.padEnd(6),u="";r?.correlationId?u=`[${r.correlationId}] `:r?.sessionId&&(u=`[session-${r.sessionId}] `);let E="";o!=null&&(this.level===0&&typeof o=="object"?E=`
`+JSON.stringify(o,null,2):E=" "+this.formatData(o));let T="";if(r){let{sessionId:l,sdkSessionId:S,correlationId:c,...a}=r;Object.keys(a).length>0&&(T=` {${Object.entries(a).map(([w,M])=>`${w}=${M}`).join(", ")}}`)}let R=`[${n}] [${i}] [${p}] ${u}${t}${T}${E}`;e===3?console.error(R):console.log(R)}debug(e,s,t,r){this.log(0,e,s,t,r)}info(e,s,t,r){this.log(1,e,s,t,r)}warn(e,s,t,r){this.log(2,e,s,t,r)}error(e,s,t,r){this.log(3,e,s,t,r)}dataIn(e,s,t,r){this.info(e,`\u2192 ${s}`,t,r)}dataOut(e,s,t,r){this.info(e,`\u2190 ${s}`,t,r)}success(e,s,t,r){this.info(e,`\u2713 ${s}`,t,r)}failure(e,s,t,r){this.error(e,`\u2717 ${s}`,t,r)}timing(e,s,t,r){this.info(e,`\u23F1 ${s}`,r,{duration:`${t}ms`})}},b=new f;var g=class{db;constructor(){v(m),this.db=new W(C),this.db.pragma("journal_mode = WAL"),this.db.pragma("synchronous = NORMAL"),this.db.pragma("foreign_keys = ON"),this.initializeSchema(),this.ensureWorkerPortColumn(),this.ensurePromptTrackingColumns(),this.removeSessionSummariesUniqueConstraint(),this.addObservationHierarchicalFields(),this.makeObservationsTextNullable(),this.createUserPromptsTable()}initializeSchema(){try{this.db.exec(`
import{stdin as v}from"process";import w from"better-sqlite3";import{join as E,dirname as k,basename as P}from"path";import{homedir as f}from"os";import{existsSync as j,mkdirSync as x}from"fs";import{fileURLToPath as U}from"url";function M(){return typeof __dirname<"u"?__dirname:k(U(import.meta.url))}var K=M(),m=process.env.CLAUDE_MEM_DATA_DIR||E(f(),".claude-mem"),N=process.env.CLAUDE_CONFIG_DIR||E(f(),".claude"),V=E(m,"archives"),q=E(m,"logs"),J=E(m,"trash"),Q=E(m,"backups"),z=E(m,"settings.json"),L=E(m,"claude-mem.db"),Z=E(m,"vector-db"),ee=E(N,"settings.json"),se=E(N,"commands"),te=E(N,"CLAUDE.md");function A(p){x(p,{recursive:!0})}var h=(n=>(n[n.DEBUG=0]="DEBUG",n[n.INFO=1]="INFO",n[n.WARN=2]="WARN",n[n.ERROR=3]="ERROR",n[n.SILENT=4]="SILENT",n))(h||{}),O=class{level;useColor;constructor(){let e=process.env.CLAUDE_MEM_LOG_LEVEL?.toUpperCase()||"INFO";this.level=h[e]??1,this.useColor=process.stdout.isTTY??!1}correlationId(e,s){return`obs-${e}-${s}`}sessionId(e){return`session-${e}`}formatData(e){if(e==null)return"";if(typeof e=="string")return e;if(typeof e=="number"||typeof e=="boolean")return e.toString();if(typeof e=="object"){if(e instanceof Error)return this.level===0?`${e.message}
${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Object.keys(e);return s.length===0?"{}":s.length<=3?JSON.stringify(e):`{${s.length} keys: ${s.slice(0,3).join(", ")}...}`}return String(e)}formatTool(e,s){if(!s)return e;try{let t=typeof s=="string"?JSON.parse(s):s;if(e==="Bash"&&t.command){let r=t.command.length>50?t.command.substring(0,50)+"...":t.command;return`${e}(${r})`}if(e==="Read"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Edit"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Write"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}return e}catch{return e}}log(e,s,t,r,n){if(e<this.level)return;let o=new Date().toISOString().replace("T"," ").substring(0,23),i=h[e].padEnd(5),d=s.padEnd(6),_="";r?.correlationId?_=`[${r.correlationId}] `:r?.sessionId&&(_=`[session-${r.sessionId}] `);let u="";n!=null&&(this.level===0&&typeof n=="object"?u=`
`+JSON.stringify(n,null,2):u=" "+this.formatData(n));let T="";if(r){let{sessionId:l,sdkSessionId:S,correlationId:c,...a}=r;Object.keys(a).length>0&&(T=` {${Object.entries(a).map(([y,D])=>`${y}=${D}`).join(", ")}}`)}let R=`[${o}] [${i}] [${d}] ${_}${t}${T}${u}`;e===3?console.error(R):console.log(R)}debug(e,s,t,r){this.log(0,e,s,t,r)}info(e,s,t,r){this.log(1,e,s,t,r)}warn(e,s,t,r){this.log(2,e,s,t,r)}error(e,s,t,r){this.log(3,e,s,t,r)}dataIn(e,s,t,r){this.info(e,`\u2192 ${s}`,t,r)}dataOut(e,s,t,r){this.info(e,`\u2190 ${s}`,t,r)}success(e,s,t,r){this.info(e,`\u2713 ${s}`,t,r)}failure(e,s,t,r){this.error(e,`\u2717 ${s}`,t,r)}timing(e,s,t,r){this.info(e,`\u23F1 ${s}`,r,{duration:`${t}ms`})}},b=new O;var g=class{db;constructor(){A(m),this.db=new w(L),this.db.pragma("journal_mode = WAL"),this.db.pragma("synchronous = NORMAL"),this.db.pragma("foreign_keys = ON"),this.initializeSchema(),this.ensureWorkerPortColumn(),this.ensurePromptTrackingColumns(),this.removeSessionSummariesUniqueConstraint(),this.addObservationHierarchicalFields(),this.makeObservationsTextNullable(),this.createUserPromptsTable()}initializeSchema(){try{this.db.exec(`
CREATE TABLE IF NOT EXISTS schema_versions (
id INTEGER PRIMARY KEY,
version INTEGER UNIQUE NOT NULL,
@@ -63,7 +63,7 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
CREATE INDEX IF NOT EXISTS idx_session_summaries_sdk_session ON session_summaries(sdk_session_id);
CREATE INDEX IF NOT EXISTS idx_session_summaries_project ON session_summaries(project);
CREATE INDEX IF NOT EXISTS idx_session_summaries_created ON session_summaries(created_at_epoch DESC);
`),this.db.prepare("INSERT INTO schema_versions (version, applied_at) VALUES (?, ?)").run(4,new Date().toISOString()),console.error("[SessionStore] Migration004 applied successfully"))}catch(e){throw console.error("[SessionStore] Schema initialization error:",e.message),e}}ensureWorkerPortColumn(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(5))return;this.db.pragma("table_info(sdk_sessions)").some(r=>r.name==="worker_port")||(this.db.exec("ALTER TABLE sdk_sessions ADD COLUMN worker_port INTEGER"),console.error("[SessionStore] Added worker_port column to sdk_sessions table")),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(5,new Date().toISOString())}catch(e){console.error("[SessionStore] Migration error:",e.message)}}ensurePromptTrackingColumns(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(6))return;this.db.pragma("table_info(sdk_sessions)").some(p=>p.name==="prompt_counter")||(this.db.exec("ALTER TABLE sdk_sessions ADD COLUMN prompt_counter INTEGER DEFAULT 0"),console.error("[SessionStore] Added prompt_counter column to sdk_sessions table")),this.db.pragma("table_info(observations)").some(p=>p.name==="prompt_number")||(this.db.exec("ALTER TABLE observations ADD COLUMN prompt_number INTEGER"),console.error("[SessionStore] Added prompt_number column to observations table")),this.db.pragma("table_info(session_summaries)").some(p=>p.name==="prompt_number")||(this.db.exec("ALTER TABLE session_summaries ADD COLUMN prompt_number INTEGER"),console.error("[SessionStore] Added prompt_number column to session_summaries table")),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(6,new Date().toISOString())}catch(e){console.error("[SessionStore] Prompt tracking migration error:",e.message)}}removeSessionSummariesUniqueConstraint(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = 
?").get(7))return;if(!this.db.pragma("index_list(session_summaries)").some(r=>r.unique===1)){this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(7,new Date().toISOString());return}console.error("[SessionStore] Removing UNIQUE constraint from session_summaries.sdk_session_id..."),this.db.exec("BEGIN TRANSACTION");try{this.db.exec(`
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
@@ -5,7 +5,6 @@
 import { stdin } from 'process';
 import { SessionStore } from '../services/sqlite/SessionStore.js';
-import { ensureWorkerRunning } from '../shared/worker-utils.js';
 
 export interface SessionEndInput {
   session_id: string;
@@ -45,12 +44,6 @@ async function cleanupHook(input?: SessionEndInput): Promise<void> {
   const { session_id, reason } = input;
   console.error('[claude-mem cleanup] Searching for active SDK session', { session_id, reason });
 
-  // Ensure worker is running first
-  const workerReady = await ensureWorkerRunning();
-  if (!workerReady) {
-    console.error('[claude-mem cleanup] Worker not available - skipping HTTP cleanup');
-  }
-
   // Find active SDK session
   const db = new SessionStore();
   const session = db.findActiveSDKSession(session_id);
@@ -6,7 +6,6 @@
 import path from 'path';
 import { stdin } from 'process';
 import { SessionStore } from '../services/sqlite/SessionStore.js';
-import { ensureWorkerRunning } from '../shared/worker-utils.js';
 
 // Configuration: Read from environment or use defaults
 const DISPLAY_OBSERVATION_COUNT = parseInt(process.env.CLAUDE_MEM_CONTEXT_OBSERVATIONS || '50', 10);
@@ -127,7 +126,6 @@
  * Context Hook Main Logic
  */
 function contextHook(input?: SessionStartInput, useColors: boolean = false, useIndexView: boolean = false): string {
-  ensureWorkerRunning();
   const cwd = input?.cwd ?? process.cwd();
   const project = cwd ? path.basename(cwd) : 'unknown-project';
+21 -19
@@ -7,7 +7,6 @@ import path from 'path';
 import { stdin } from 'process';
 import { SessionStore } from '../services/sqlite/SessionStore.js';
 import { createHookResponse } from './hook-response.js';
-import { ensureWorkerRunning, getWorkerPort } from '../shared/worker-utils.js';
 
 export interface UserPromptSubmitInput {
   session_id: string;
@@ -27,12 +26,6 @@ async function newHook(input?: UserPromptSubmitInput): Promise<void> {
   const { session_id, cwd, prompt } = input;
   const project = path.basename(cwd);
 
-  // Ensure worker is running first
-  const workerReady = await ensureWorkerRunning();
-  if (!workerReady) {
-    throw new Error('Worker service failed to start or become healthy');
-  }
-
   const db = new SessionStore();
 
   // Save session_id for indexing
@@ -46,20 +39,29 @@ async function newHook(input?: UserPromptSubmitInput): Promise<void> {
 
   db.close();
 
-  // Get fixed port
-  const port = getWorkerPort();
+  // Use fixed worker port
+  const FIXED_PORT = parseInt(process.env.CLAUDE_MEM_WORKER_PORT || '37777', 10);
 
-  // Initialize session via HTTP
-  const response = await fetch(`http://127.0.0.1:${port}/sessions/${sessionDbId}/init`, {
-    method: 'POST',
-    headers: { 'Content-Type': 'application/json' },
-    body: JSON.stringify({ project, userPrompt: prompt }),
-    signal: AbortSignal.timeout(5000)
-  });
+  try {
+    // Initialize session via HTTP
+    const response = await fetch(`http://127.0.0.1:${FIXED_PORT}/sessions/${sessionDbId}/init`, {
+      method: 'POST',
+      headers: { 'Content-Type': 'application/json' },
+      body: JSON.stringify({ project, userPrompt: prompt }),
+      signal: AbortSignal.timeout(5000)
+    });
 
-  if (!response.ok) {
-    const errorText = await response.text();
-    throw new Error(`Failed to initialize session: ${response.status} ${errorText}`);
+    if (!response.ok) {
+      const errorText = await response.text();
+      throw new Error(`Failed to initialize session: ${response.status} ${errorText}`);
+    }
+  } catch (error: any) {
+    // Only show restart message for connection errors, not HTTP errors
+    if (error.cause?.code === 'ECONNREFUSED' || error.name === 'TimeoutError' || error.message.includes('fetch failed')) {
+      throw new Error("There's a problem with the worker. If you just updated, type `pm2 restart claude-mem-worker` in your terminal to continue");
+    }
+    // Re-throw HTTP errors and other errors as-is
+    throw error;
   }
 
   console.log(createHookResponse('UserPromptSubmit', true));
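The connection-error check added above is repeated verbatim in the save and summary hooks below. As a sketch only, the test can be factored into a helper; `isWorkerConnectionError` is a hypothetical name, not part of the plugin, and the error shapes are assumptions based on what Node's `fetch` raises:

```typescript
// Hypothetical helper, not part of claude-mem: classifies errors from
// fetch() against the worker so hooks can show the pm2 restart hint
// only for connection failures, never for HTTP-level errors.
function isWorkerConnectionError(error: any): boolean {
  return (
    error?.cause?.code === 'ECONNREFUSED' ||  // nothing listening on the port
    error?.name === 'TimeoutError' ||         // AbortSignal.timeout() fired
    (typeof error?.message === 'string' && error.message.includes('fetch failed'))
  );
}
```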
+29 -26
@@ -7,7 +7,6 @@ import { stdin } from 'process';
 import { SessionStore } from '../services/sqlite/SessionStore.js';
 import { createHookResponse } from './hook-response.js';
 import { logger } from '../utils/logger.js';
-import { ensureWorkerRunning } from '../shared/worker-utils.js';
 
 export interface PostToolUseInput {
   session_id: string;
@@ -38,12 +37,6 @@ async function saveHook(input?: PostToolUseInput): Promise<void> {
     return;
   }
 
-  // Ensure worker is running first
-  const workerReady = await ensureWorkerRunning();
-  if (!workerReady) {
-    throw new Error('Worker service failed to start or become healthy');
-  }
-
   const db = new SessionStore();
 
   // Get or create session
@@ -61,28 +54,38 @@ async function saveHook(input?: PostToolUseInput): Promise<void> {
     workerPort: FIXED_PORT
   });
 
-  const response = await fetch(`http://127.0.0.1:${FIXED_PORT}/sessions/${sessionDbId}/observations`, {
-    method: 'POST',
-    headers: { 'Content-Type': 'application/json' },
-    body: JSON.stringify({
-      tool_name,
-      tool_input: tool_input !== undefined ? JSON.stringify(tool_input) : '{}',
-      tool_output: tool_output !== undefined ? JSON.stringify(tool_output) : '{}',
-      prompt_number: promptNumber
-    }),
-    signal: AbortSignal.timeout(2000)
-  });
+  try {
+    const response = await fetch(`http://127.0.0.1:${FIXED_PORT}/sessions/${sessionDbId}/observations`, {
+      method: 'POST',
+      headers: { 'Content-Type': 'application/json' },
+      body: JSON.stringify({
+        tool_name,
+        tool_input: tool_input !== undefined ? JSON.stringify(tool_input) : '{}',
+        tool_output: tool_output !== undefined ? JSON.stringify(tool_output) : '{}',
+        prompt_number: promptNumber
+      }),
+      signal: AbortSignal.timeout(2000)
+    });
 
-  if (!response.ok) {
-    const errorText = await response.text();
-    logger.failure('HOOK', 'Failed to send observation', {
-      sessionId: sessionDbId,
-      status: response.status
-    }, errorText);
-    throw new Error(`Failed to send observation to worker: ${response.status} ${errorText}`);
-  }
+    if (!response.ok) {
+      const errorText = await response.text();
+      logger.failure('HOOK', 'Failed to send observation', {
+        sessionId: sessionDbId,
+        status: response.status
+      }, errorText);
+      throw new Error(`Failed to send observation to worker: ${response.status} ${errorText}`);
+    }
 
-  logger.debug('HOOK', 'Observation sent successfully', { sessionId: sessionDbId, toolName: tool_name });
+    logger.debug('HOOK', 'Observation sent successfully', { sessionId: sessionDbId, toolName: tool_name });
+  } catch (error: any) {
+    // Only show restart message for connection errors, not HTTP errors
+    if (error.cause?.code === 'ECONNREFUSED' || error.name === 'TimeoutError' || error.message.includes('fetch failed')) {
+      throw new Error("There's a problem with the worker. If you just updated, type `pm2 restart claude-mem-worker` in your terminal to continue");
+    }
+    // Re-throw HTTP errors and other errors as-is
+    throw error;
+  }
 
   console.log(createHookResponse('PostToolUse', true));
 }
+24 -21
@@ -7,7 +7,6 @@ import { stdin } from 'process';
 import { SessionStore } from '../services/sqlite/SessionStore.js';
 import { createHookResponse } from './hook-response.js';
 import { logger } from '../utils/logger.js';
-import { ensureWorkerRunning } from '../shared/worker-utils.js';
 
 export interface StopInput {
   session_id: string;
@@ -25,12 +24,6 @@ async function summaryHook(input?: StopInput): Promise<void> {
 
   const { session_id } = input;
 
-  // Ensure worker is running first
-  const workerReady = await ensureWorkerRunning();
-  if (!workerReady) {
-    throw new Error('Worker service failed to start or become healthy');
-  }
-
   const db = new SessionStore();
 
   // Get or create session
@@ -47,23 +40,33 @@ async function summaryHook(input?: StopInput): Promise<void> {
     promptNumber
   });
 
-  const response = await fetch(`http://127.0.0.1:${FIXED_PORT}/sessions/${sessionDbId}/summarize`, {
-    method: 'POST',
-    headers: { 'Content-Type': 'application/json' },
-    body: JSON.stringify({ prompt_number: promptNumber }),
-    signal: AbortSignal.timeout(2000)
-  });
+  try {
+    const response = await fetch(`http://127.0.0.1:${FIXED_PORT}/sessions/${sessionDbId}/summarize`, {
+      method: 'POST',
+      headers: { 'Content-Type': 'application/json' },
+      body: JSON.stringify({ prompt_number: promptNumber }),
+      signal: AbortSignal.timeout(2000)
+    });
 
-  if (!response.ok) {
-    const errorText = await response.text();
-    logger.failure('HOOK', 'Failed to generate summary', {
-      sessionId: sessionDbId,
-      status: response.status
-    }, errorText);
-    throw new Error(`Failed to request summary from worker: ${response.status} ${errorText}`);
-  }
+    if (!response.ok) {
+      const errorText = await response.text();
+      logger.failure('HOOK', 'Failed to generate summary', {
+        sessionId: sessionDbId,
+        status: response.status
+      }, errorText);
+      throw new Error(`Failed to request summary from worker: ${response.status} ${errorText}`);
+    }
 
-  logger.debug('HOOK', 'Summary request sent successfully', { sessionId: sessionDbId });
+    logger.debug('HOOK', 'Summary request sent successfully', { sessionId: sessionDbId });
+  } catch (error: any) {
+    // Only show restart message for connection errors, not HTTP errors
+    if (error.cause?.code === 'ECONNREFUSED' || error.name === 'TimeoutError' || error.message.includes('fetch failed')) {
+      throw new Error("There's a problem with the worker. If you just updated, type `pm2 restart claude-mem-worker` in your terminal to continue");
+    }
+    // Re-throw HTTP errors and other errors as-is
+    throw error;
+  }
 
   console.log(createHookResponse('Stop', true));
 }
@@ -19,14 +19,25 @@ const MODEL = process.env.CLAUDE_MEM_MODEL || 'claude-sonnet-4-5';
 const DISALLOWED_TOOLS = ['Glob', 'Grep', 'ListMcpResourcesTool', 'WebSearch'];
 const FIXED_PORT = parseInt(process.env.CLAUDE_MEM_WORKER_PORT || '37777', 10);
 
 /**
+ * Cached Claude executable path
+ */
+let cachedClaudePath: string | null = null;
+
+/**
  * Find Claude Code executable path using which (Unix/Mac) or where (Windows)
+ * Cached after first call
  */
 function findClaudePath(): string {
+  if (cachedClaudePath) {
+    return cachedClaudePath;
+  }
+
   try {
     // Try environment variable first
     if (process.env.CLAUDE_CODE_PATH) {
-      return process.env.CLAUDE_CODE_PATH;
+      cachedClaudePath = process.env.CLAUDE_CODE_PATH;
+      return cachedClaudePath;
     }
 
     // Use which on Unix/Mac, where on Windows
@@ -41,7 +52,8 @@ function findClaudePath(): string {
     }
 
     logger.info('SYSTEM', `Found Claude executable: ${path}`);
-    return path;
+    cachedClaudePath = path;
+    return cachedClaudePath;
   } catch (error: any) {
     logger.failure('SYSTEM', 'Failed to find Claude executable', {}, error);
     throw new Error('Claude Code executable not found. Please ensure claude is in your PATH or set CLAUDE_CODE_PATH environment variable.');
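The change above is plain module-level memoization of an expensive lookup. The same shape in isolation (names here are illustrative, not the plugin's API; `lookup` stands in for the `which`/`where` subprocess call):

```typescript
// Module-level memoization, mirroring the cachedClaudePath pattern:
// the expensive lookup runs once; later calls return the cached value.
let cachedPath: string | null = null;

function findPathOnce(lookup: () => string): string {
  if (cachedPath) {
    return cachedPath; // fast path: skip the lookup entirely
  }
  cachedPath = lookup(); // slow path: pay the cost exactly once
  return cachedPath;
}
```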
@@ -76,24 +88,19 @@ interface ActiveSession {
   abortController: AbortController;
   generatorPromise: Promise<void> | null;
   lastPromptNumber: number; // Track which prompt_number we last sent to SDK
   observationCounter: number; // Counter for correlation IDs
   startTime: number; // Session start timestamp
 }
 
 class WorkerService {
   private app: express.Application;
-  private port: number | null = null;
+  private port: number = FIXED_PORT;
   private sessions: Map<number, ActiveSession> = new Map();
-  private chromaSync: ChromaSync;
+  private chromaSync!: ChromaSync;
 
   constructor() {
     this.app = express();
     this.app.use(express.json({ limit: '50mb' }));
 
-    // Initialize ChromaSync (fail fast if Chroma unavailable)
-    this.chromaSync = new ChromaSync('claude-mem');
-    logger.info('SYSTEM', 'ChromaSync initialized');
-
     // Health check
     this.app.get('/health', this.handleHealth.bind(this));
@@ -106,7 +113,17 @@ class WorkerService {
   }
 
   async start(): Promise<void> {
-    this.port = FIXED_PORT;
+    // Start HTTP server FIRST - nothing else matters until we can respond
+    await new Promise<void>((resolve, reject) => {
+      this.app.listen(FIXED_PORT, () => resolve())
+        .on('error', reject);
+    });
+
+    logger.info('SYSTEM', 'Worker started', { port: FIXED_PORT, pid: process.pid });
+
+    // Initialize ChromaSync after HTTP is ready
+    this.chromaSync = new ChromaSync('claude-mem');
+    logger.info('SYSTEM', 'ChromaSync initialized');
 
     // Clean up orphaned sessions from previous worker instances
     const db = new SessionStore();
@@ -117,41 +134,23 @@ class WorkerService {
       logger.info('SYSTEM', `Cleaned up ${cleanedCount} orphaned sessions`);
     }
 
-    // Backfill Chroma with any missing observations/summaries (blocking)
-    logger.info('SYSTEM', 'Starting Chroma backfill...');
-    try {
-      await this.chromaSync.ensureBackfilled();
-      logger.info('SYSTEM', 'Chroma backfill complete');
-    } catch (error) {
-      logger.error('SYSTEM', 'Chroma backfill failed - worker cannot start', {}, error as Error);
-      throw error;
-    }
-
-    return new Promise((resolve, reject) => {
-      this.app.listen(FIXED_PORT, '127.0.0.1', () => {
-        logger.info('SYSTEM', `Worker started`, { port: FIXED_PORT, pid: process.pid, activeSessions: this.sessions.size });
-        resolve();
-      }).on('error', (err: any) => {
-        if (err.code === 'EADDRINUSE') {
-          logger.error('SYSTEM', `Port ${FIXED_PORT} already in use - worker may already be running`);
-        }
-        reject(err);
-      });
-    });
+    // Backfill Chroma in background (non-blocking, non-critical)
+    logger.info('SYSTEM', 'Starting Chroma backfill in background...');
+    this.chromaSync.ensureBackfilled()
+      .then(() => {
+        logger.info('SYSTEM', 'Chroma backfill complete');
+      })
+      .catch((error: Error) => {
+        logger.error('SYSTEM', 'Chroma backfill failed - continuing anyway', {}, error);
+        // Don't exit - allow worker to continue serving requests
+      });
   }
 
   /**
    * GET /health
    */
   private handleHealth(req: Request, res: Response): void {
     res.json({
       status: 'ok',
       port: this.port,
       pid: process.pid,
activeSessions: this.sessions.size,
|
||||
uptime: process.uptime(),
|
||||
memory: process.memoryUsage()
|
||||
});
|
||||
private handleHealth(_req: Request, res: Response): void {
|
||||
res.json({ status: 'ok' });
|
||||
}
|
||||
|
||||
/**
|
||||
@@ -162,8 +161,7 @@ class WorkerService {
|
||||
const sessionDbId = parseInt(req.params.sessionDbId, 10);
|
||||
const { project, userPrompt } = req.body;
|
||||
|
||||
const correlationId = logger.sessionId(sessionDbId);
|
||||
logger.info('WORKER', 'Session init', { correlationId, project });
|
||||
logger.info('WORKER', 'Session init', { sessionDbId, project });
|
||||
|
||||
// Fetch real Claude Code session ID from database
|
||||
const db = new SessionStore();
|
||||
@@ -187,7 +185,6 @@ class WorkerService {
|
||||
abortController: new AbortController(),
|
||||
generatorPromise: null,
|
||||
lastPromptNumber: 0,
|
||||
observationCounter: 0,
|
||||
startTime: Date.now()
|
||||
};
|
||||
|
||||
@@ -221,8 +218,8 @@ class WorkerService {
|
||||
latestPrompt.prompt_number,
|
||||
latestPrompt.created_at_epoch
|
||||
).catch(err => {
|
||||
logger.failure('WORKER', 'Failed to sync user_prompt to Chroma', { promptId: latestPrompt.id }, err);
|
||||
process.exit(1); // Fail fast - Chroma sync is critical
|
||||
logger.failure('WORKER', 'Failed to sync user_prompt to Chroma - continuing', { promptId: latestPrompt.id }, err);
|
||||
// Don't crash - SQLite has the data
|
||||
});
|
||||
}
|
||||
|
||||
@@ -268,7 +265,6 @@ class WorkerService {
|
||||
abortController: new AbortController(),
|
||||
generatorPromise: null,
|
||||
lastPromptNumber: 0,
|
||||
observationCounter: 0,
|
||||
startTime: Date.now()
|
||||
};
|
||||
this.sessions.set(sessionDbId, session);
|
||||
@@ -283,13 +279,10 @@ class WorkerService {
|
||||
});
|
||||
}
|
||||
|
||||
// Create correlation ID for tracking this observation
|
||||
session.observationCounter++;
|
||||
const correlationId = logger.correlationId(sessionDbId, session.observationCounter);
|
||||
const toolStr = logger.formatTool(tool_name, tool_input);
|
||||
|
||||
logger.dataIn('WORKER', `Observation queued: ${toolStr}`, {
|
||||
correlationId,
|
||||
sessionId: sessionDbId,
|
||||
queue: session.pendingMessages.length + 1
|
||||
});
|
||||
|
||||
@@ -329,7 +322,6 @@ class WorkerService {
|
||||
abortController: new AbortController(),
|
||||
generatorPromise: null,
|
||||
lastPromptNumber: 0,
|
||||
observationCounter: 0,
|
||||
startTime: Date.now()
|
||||
};
|
||||
this.sessions.set(sessionDbId, session);
|
||||
@@ -559,14 +551,13 @@ class WorkerService {
|
||||
});
|
||||
|
||||
const toolStr = logger.formatTool(message.tool_name, message.tool_input);
|
||||
const correlationId = logger.correlationId(session.sessionDbId, session.observationCounter);
|
||||
|
||||
logger.dataIn('SDK', `Observation prompt: ${toolStr}`, {
|
||||
correlationId,
|
||||
sessionId: session.sessionDbId,
|
||||
promptNumber: message.prompt_number,
|
||||
size: `${observationPrompt.length} chars`
|
||||
});
|
||||
logger.debug('SDK', 'Full observation prompt', { correlationId }, observationPrompt);
|
||||
logger.debug('SDK', 'Full observation prompt', { sessionId: session.sessionDbId }, observationPrompt);
|
||||
|
||||
yield {
|
||||
type: 'user',
|
||||
@@ -587,8 +578,6 @@ class WorkerService {
|
||||
* Gets prompt_number from the message that triggered this response
|
||||
*/
|
||||
private handleAgentMessage(session: ActiveSession, content: string, promptNumber: number): void {
|
||||
const correlationId = logger.correlationId(session.sessionDbId, session.observationCounter);
|
||||
|
||||
// Always log what we received for debugging
|
||||
logger.info('PARSER', `Processing response (${content.length} chars)`, {
|
||||
sessionId: session.sessionDbId,
|
||||
@@ -597,11 +586,11 @@ class WorkerService {
|
||||
});
|
||||
|
||||
// Parse observations
|
||||
const observations = parseObservations(content, correlationId);
|
||||
const observations = parseObservations(content);
|
||||
|
||||
if (observations.length > 0) {
|
||||
logger.info('PARSER', `Parsed ${observations.length} observation(s)`, {
|
||||
correlationId,
|
||||
sessionId: session.sessionDbId,
|
||||
promptNumber,
|
||||
types: observations.map(o => o.type).join(', ')
|
||||
});
|
||||
@@ -613,7 +602,7 @@ class WorkerService {
|
||||
for (const obs of observations) {
|
||||
const { id, createdAtEpoch } = db.storeObservation(session.claudeSessionId, session.project, obs, promptNumber);
|
||||
logger.success('DB', 'Observation stored', {
|
||||
correlationId,
|
||||
sessionId: session.sessionDbId,
|
||||
type: obs.type,
|
||||
title: obs.title,
|
||||
id
|
||||
@@ -628,16 +617,16 @@ class WorkerService {
|
||||
promptNumber,
|
||||
createdAtEpoch
|
||||
).then(() => {
|
||||
logger.success('CHROMA', 'Observation synced', {
|
||||
correlationId,
|
||||
logger.success('WORKER', 'Observation synced to Chroma', {
|
||||
sessionId: session.sessionDbId,
|
||||
observationId: id
|
||||
});
|
||||
}).catch((error: Error) => {
|
||||
logger.error('CHROMA', 'Observation sync failed - crashing worker', {
|
||||
correlationId,
|
||||
logger.error('WORKER', 'Observation sync failed - continuing', {
|
||||
sessionId: session.sessionDbId,
|
||||
observationId: id
|
||||
}, error);
|
||||
process.exit(1); // Fail fast - no fallbacks
|
||||
// Don't crash - SQLite has the data
|
||||
});
|
||||
}
|
||||
|
||||
@@ -667,16 +656,16 @@ class WorkerService {
|
||||
promptNumber,
|
||||
createdAtEpoch
|
||||
).then(() => {
|
||||
logger.success('CHROMA', 'Summary synced', {
|
||||
logger.success('WORKER', 'Summary synced to Chroma', {
|
||||
sessionId: session.sessionDbId,
|
||||
summaryId: id
|
||||
});
|
||||
}).catch((error: Error) => {
|
||||
logger.error('CHROMA', 'Summary sync failed - crashing worker', {
|
||||
logger.error('WORKER', 'Summary sync failed - continuing', {
|
||||
sessionId: session.sessionDbId,
|
||||
summaryId: id
|
||||
}, error);
|
||||
process.exit(1); // Fail fast - no fallbacks
|
||||
// Don't crash - SQLite has the data
|
||||
});
|
||||
} else {
|
||||
logger.warn('PARSER', 'NO SUMMARY TAGS FOUND in response', {
|
||||
|
||||
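Two recurring patterns in the diff above are the module-level cache for the Claude executable path and the switch from `process.exit(1)` to log-and-continue on Chroma sync failures. A minimal sketch of the caching half, with illustrative names rather than the plugin's actual helpers (`whichLookup` stands in for the real `which`/`where` subprocess call):

```typescript
// Hypothetical sketch of the caching pattern: resolve once, reuse thereafter.
let cachedClaudePath: string | null = null;
let lookups = 0; // instrumented only to show the cache working

function whichLookup(): string {
  lookups++; // the real implementation spawns `which claude` / `where claude` here
  return '/usr/local/bin/claude';
}

function findClaudePath(): string {
  if (cachedClaudePath) return cachedClaudePath;
  cachedClaudePath = whichLookup();
  return cachedClaudePath;
}
```

However many hooks ask for the path in one process, the subprocess lookup runs once; every later call returns the cached string.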
+11 -94

```diff
@@ -1,106 +1,23 @@
 import path from 'path';
-import { existsSync } from 'fs';
 import { spawn } from 'child_process';
 import { getPackageRoot } from './paths.js';

 const FIXED_PORT = parseInt(process.env.CLAUDE_MEM_WORKER_PORT || '37777', 10);
-const HEALTH_CHECK_URL = `http://127.0.0.1:${FIXED_PORT}/health`;

 /**
- * Check if worker is responding by hitting health endpoint
+ * Ensure worker service is running
+ * Just starts PM2 - no health checks, no retries, no delays
+ * PM2 handles the rest (already running = no-op, failures = exit code)
  */
-async function checkWorkerHealth(): Promise<boolean> {
-  try {
-    const response = await fetch(HEALTH_CHECK_URL, {
-      signal: AbortSignal.timeout(500)
-    });
-    return response.ok;
-  } catch {
-    return false;
-  }
-}
+export function ensureWorkerRunning(): void {
+  const packageRoot = getPackageRoot();
+  const pm2Path = path.join(packageRoot, 'node_modules', '.bin', 'pm2');
+  const ecosystemPath = path.join(packageRoot, 'ecosystem.config.cjs');

-/**
- * Ensure worker service is running with retry logic
- * Auto-starts worker if not running (v4.0.0 feature)
- *
- * @returns true if worker is responding, false if failed to start
- */
-export async function ensureWorkerRunning(): Promise<boolean> {
-  try {
-    // Check if worker is already responding
-    if (await checkWorkerHealth()) {
-      return true;
-    }
-
-    console.error('[claude-mem] Worker not responding, starting...');
-
-    // Find worker service path
-    const packageRoot = getPackageRoot();
-    const workerPath = path.join(packageRoot, 'plugin', 'scripts', 'worker-service.cjs');
-
-    if (!existsSync(workerPath)) {
-      console.error(`[claude-mem] Worker service not found at ${workerPath}`);
-      return false;
-    }
-
-    // Start worker with PM2 (bundled dependency)
-    const ecosystemPath = path.join(packageRoot, 'ecosystem.config.cjs');
-    const pm2Path = path.join(packageRoot, 'node_modules', '.bin', 'pm2');
-
-    // Fail loudly if bundled pm2 is missing
-    if (!existsSync(pm2Path)) {
-      throw new Error(
-        `PM2 binary not found at ${pm2Path}. ` +
-        `This is a bundled dependency - try running: npm install`
-      );
-    }
-
-    if (!existsSync(ecosystemPath)) {
-      throw new Error(
-        `PM2 ecosystem config not found at ${ecosystemPath}. ` +
-        `Plugin installation may be corrupted.`
-      );
-    }
-
-    // Spawn worker with PM2
-    const proc = spawn(pm2Path, ['start', ecosystemPath], {
-      detached: true,
-      stdio: 'ignore',
-      cwd: packageRoot
-    });
-
-    // Fail loudly on spawn errors
-    proc.on('error', (err) => {
-      throw new Error(`Failed to spawn PM2: ${err.message}`);
-    });
-
-    proc.unref();
-    console.error('[claude-mem] Worker started with PM2');
-
-    // Wait for worker to become healthy (retry 3 times with 500ms delay)
-    for (let i = 0; i < 3; i++) {
-      await new Promise(resolve => setTimeout(resolve, 500));
-      if (await checkWorkerHealth()) {
-        console.error('[claude-mem] Worker is healthy');
-        return true;
-      }
-    }
-
-    console.error('[claude-mem] Worker failed to become healthy after startup');
-    return false;
-
-  } catch (error: any) {
-    console.error(`[claude-mem] Failed to start worker: ${error.message}`);
-    return false;
-  }
-}
-
-/**
- * Check if worker is currently running
- */
-export async function isWorkerRunning(): Promise<boolean> {
-  return checkWorkerHealth();
-}
+  spawn(pm2Path, ['start', ecosystemPath], {
+    cwd: packageRoot,
+    stdio: 'inherit'
+  });
+}

 /**
```
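With health checks gone, hooks simply target the fixed port; the only remaining knob is the `CLAUDE_MEM_WORKER_PORT` override visible in the diff. A hedged sketch of how a hook might resolve it (the helper name is ours, not from the codebase):

```typescript
// Resolve the worker port: env override if set, else the fixed default 37777.
function resolveWorkerPort(env: Record<string, string | undefined>): number {
  const port = parseInt(env.CLAUDE_MEM_WORKER_PORT ?? '37777', 10);
  if (Number.isNaN(port)) {
    throw new Error(`Invalid CLAUDE_MEM_WORKER_PORT: ${env.CLAUDE_MEM_WORKER_PORT}`);
  }
  return port;
}
```

A hook would then hit `http://127.0.0.1:${resolveWorkerPort(process.env)}` directly and surface a clear connection error if the request fails, trusting PM2 to keep the worker alive.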