Implement hybrid search server with Chroma + SQLite

- Built search-server.mjs successfully (55KB)
- Configured with packages: 'external' to use node_modules dependencies
- MCP config points to ${CLAUDE_PLUGIN_ROOT}/scripts/search-server.mjs
- Ready for deployment to plugin directory

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
Alex Newman
2025-10-31 23:35:44 -04:00
parent 309e8a7139
commit 9a9b00c6d8
7 changed files with 1090 additions and 627 deletions
@@ -0,0 +1,503 @@
# Hybrid Search Implementation Status
**Branch**: `feature/hybrid-search`
**Date**: 2025-10-31
**Status**: ⚠️ **PARTIALLY COMPLETE** - Needs completion and validation
---
## Executive Summary
The hybrid search feature combines semantic search (ChromaDB) with temporal filtering (SQLite) to provide better context retrieval for the claude-mem memory system. The experimental validation and initial implementation have been completed, but the production implementation is **incomplete** and requires additional work before merging to main.
### Quick Status
- ✅ **Experiment validated**: Chroma sync and search workflows work
- ⚠️ **Implementation incomplete**: search-server.ts partially updated
- ❌ **Auto-sync missing**: ChromaSync service not yet implemented
- ❌ **Testing incomplete**: MCP server not fully validated
- ❌ **Documentation pending**: CLAUDE.md and release notes not updated
---
## What Was Done
### 1. Experimental Validation (Commits: 867226c, 309e8a7)
**Files Added**:
- `experiment/chroma-sync-experiment.ts` - Manual sync tool (works ✅)
- `experiment/chroma-search-test.ts` - Search quality validator (works ✅)
- `experiment/README.md` - Experiment documentation
- `experiment/RESULTS.md` - Search quality comparison results
**Key Findings**:
- ✅ Chroma MCP connection works via `uvx chroma-mcp`
- ✅ Collection `cm__claude-mem` successfully created
- ✅ 1,390 observations synced → 8,279 vector documents
- ✅ Document format validated: `obs_{id}_{field}` with metadata
- ⚠️ Search quality results are **INCONCLUSIVE** (see Critical Issues below)
### 2. Planning Documents
**Files Created**:
- `FEATURE_PLAN_HYBRID_SEARCH.md` (486 lines) - Comprehensive 6-phase implementation plan
- `NEXT_SESSION_PROMPT.md` (193 lines) - Session continuation instructions
**Plan Structure**:
1. Phase 1: Clean Start ✅ (completed)
2. Phase 2: Architecture Review ✅ (documented)
3. Phase 3: Implementation ⚠️ (partially complete)
4. Phase 4: Validation ❌ (not started)
5. Phase 5: Documentation ❌ (not started)
6. Phase 6: Deployment ❌ (not started)
### 3. Production Code Changes
#### src/servers/search-server.ts (319 lines added)
**What Works**:
- ✅ Chroma MCP client imports added
- ✅ `queryChroma()` helper function implemented (95 lines)
  - Handles Python dict parsing with regex
  - Extracts IDs from document format `obs_{id}_{field}`
  - Parses distances and metadata correctly
- ✅ `search_observations` handler updated with hybrid workflow
  - Chroma semantic search (top 100)
  - 90-day temporal filter
  - SQLite hydration in temporal order
  - FTS5 fallback if Chroma fails
- ⚠️ `find_by_concept` handler **partially** updated
  - Metadata-first filtering via SQLite
  - Semantic ranking via Chroma
  - **INCOMPLETE**: Implementation cut off mid-function (line 554 in diff)
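The ID-extraction step described above can be sketched as follows. This is an illustrative rewrite, not the actual helper — the real `queryChroma()` in search-server.ts does this inline, and the function name here is an assumption. Documents are named `obs_{id}_{field}`, so the numeric observation ID can be recovered with a regex:

```typescript
// Hypothetical standalone version of the ID-extraction logic.
// Document IDs follow the validated format: obs_{id}_{field},
// e.g. "obs_1042_narrative" refers to field "narrative" of observation 1042.
function extractObservationId(chromaId: string): number | null {
  const match = /^obs_(\d+)_/.exec(chromaId);
  return match ? Number(match[1]) : null;
}
```

Keeping this parsing in one small function (rather than duplicated per handler) would also address the duplication concern raised under Technical Debt below.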
**What's Missing**:
- ❌ Chroma client initialization in `main()` function
- ❌ `find_by_type` handler not updated
- ❌ `find_by_file` handler not updated
- ❌ Error handling not comprehensive
- ❌ Logging not fully implemented
#### src/services/sqlite/SessionStore.ts (27 lines added)
**What Works**:
- ✅ `getObservationsByIds()` method added (lines 622-645)
  - Accepts array of IDs
  - Supports temporal ordering (date_desc/date_asc)
  - Supports limit parameter
  - Uses parameterized queries (SQL injection safe)
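The query construction behind a method like this could look roughly as follows. The method name and line range come from the notes above; the exact SQL and function shape here are assumptions, shown only to illustrate the parameterized `IN (...)` pattern:

```typescript
// Sketch of how getObservationsByIds() might build its parameterized query.
// One "?" placeholder per ID keeps the query SQL-injection safe.
function buildGetByIdsQuery(
  ids: number[],
  orderBy: 'date_desc' | 'date_asc',
  limit?: number,
): { sql: string; params: number[] } {
  const placeholders = ids.map(() => '?').join(', ');
  const dir = orderBy === 'date_asc' ? 'ASC' : 'DESC';
  const limitClause = limit !== undefined ? ' LIMIT ?' : '';
  const params = limit !== undefined ? [...ids, limit] : [...ids];
  return {
    sql: `SELECT * FROM observations WHERE id IN (${placeholders}) ORDER BY created_at_epoch ${dir}${limitClause}`,
    params,
  };
}
```

In the actual SessionStore the resulting SQL would be handed to a prepared statement, e.g. `this.db.prepare(sql).all(...params)`.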
#### src/shared/paths.ts (1 line added)
**What Works**:
- ✅ `VECTOR_DB_DIR` constant added
  - Points to `~/.claude-mem/vector-db/`
  - Used by Chroma MCP client
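The constant presumably looks like this (a one-line sketch matching the path stated above; the actual declaration in `src/shared/paths.ts` may differ in naming of its neighbors):

```typescript
import { join } from 'path';
import { homedir } from 'os';

// Assumed shape of the added constant: the persistent Chroma data directory.
const VECTOR_DB_DIR = join(homedir(), '.claude-mem', 'vector-db');
```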
---
## What's Next (Critical Path)
### Immediate Blockers (Must Fix Before Merge)
#### 1. Complete search-server.ts Implementation
**File**: `src/servers/search-server.ts`
**Missing Code**:
a) **Initialize Chroma client in main() function** (~20 lines):
```typescript
// Add to main() function before server.connect()
const chromaTransport = new StdioClientTransport({
command: 'uvx',
args: ['chroma-mcp', '--client-type', 'persistent', '--data-dir', VECTOR_DB_DIR]
});
chromaClient = new Client(
{ name: 'claude-mem-search-chroma-client', version: '1.0.0' },
{ capabilities: {} }
);
await chromaClient.connect(chromaTransport);
console.error('[search-server] Chroma client connected');
```
b) **Complete find_by_concept handler** (~30 lines):
- The implementation is cut off mid-function
- Need to complete the semantic ranking logic
- Need to hydrate results from SQLite in semantic rank order
- Need to add error handling and FTS5 fallback
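The hydrate-in-semantic-rank-order step could take a shape like the following. This is a sketch under assumptions, not the missing code itself: given Chroma's ranked IDs and the SQLite rows fetched for them, reorder the rows so semantic rank is preserved:

```typescript
// Minimal sketch: preserve Chroma's semantic rank when hydrating from SQLite.
// SQLite's IN (...) query returns rows in arbitrary order, so we re-sort
// by the ranked ID list and drop IDs with no matching row.
interface ObservationRow { id: number; [key: string]: unknown; }

function hydrateInRankOrder(rankedIds: number[], rows: ObservationRow[]): ObservationRow[] {
  const byId = new Map(rows.map((r): [number, ObservationRow] => [r.id, r]));
  return rankedIds
    .map((id) => byId.get(id))
    .filter((r): r is ObservationRow => r !== undefined);
}
```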
c) **Update find_by_type handler** (~50 lines):
- Same pattern as find_by_concept
- Metadata filter first (SQLite)
- Semantic ranking second (Chroma)
- Preserve rank order in results
d) **Update find_by_file handler** (~50 lines):
- Same pattern as find_by_concept
- File path filter first (SQLite)
- Semantic ranking second (Chroma)
- Preserve rank order in results
**Total Estimated Effort**: 2-3 hours
#### 2. Implement Auto-Sync Service
**NEW File**: `src/services/sync/ChromaSync.ts` (~200 lines)
**Purpose**: Automatically sync new observations to Chroma when worker saves them
**Required Methods**:
```typescript
class ChromaSync {
async syncObservation(obs: Observation): Promise<void>
async syncBatch(observations: Observation[]): Promise<void>
async ensureCollection(): Promise<void>
private async connectChroma(): Promise<void>
private formatObservationDocuments(obs: Observation): ChromaDocument[]
}
```
**Integration Points**:
- `src/services/worker-service.ts` - Call after saving observation to SQLite
- Batch sync on startup for any missing observations
- Use same document format as experiment: `obs_{id}_{field}`
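The document-formatting method could be sketched like this. The field names and interfaces are assumptions (the service does not exist yet); the `obs_{id}_{field}` ID format is taken from the experiment notes above:

```typescript
// Hypothetical sketch of formatObservationDocuments(): each non-empty text
// field of an observation becomes its own vector document, matching the
// experiment's obs_{id}_{field} format (1,390 observations -> 8,279 documents
// implies several field-documents per observation).
interface Observation { id: number; title?: string; narrative?: string; text?: string; }
interface ChromaDoc { id: string; document: string; metadata: { observation_id: number; field: string } }

function formatObservationDocuments(obs: Observation): ChromaDoc[] {
  const fields: Array<[string, string | undefined]> = [
    ['title', obs.title],
    ['narrative', obs.narrative],
    ['text', obs.text],
  ];
  return fields
    .filter(([, value]) => value !== undefined && value.length > 0)
    .map(([field, value]) => ({
      id: `obs_${obs.id}_${field}`,
      document: value as string,
      metadata: { observation_id: obs.id, field },
    }));
}
```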
**Total Estimated Effort**: 2-3 hours
#### 3. Build and Validation
**Steps**:
1. Build all scripts: `npm run build`
2. Verify ESM format: `head -1 plugin/scripts/search-server.js`
3. Delete stale builds: `rm -f plugin/scripts/*.cjs`
4. Test sync: `npx tsx experiment/chroma-sync-experiment.ts`
5. Test search: `npx tsx experiment/chroma-search-test.ts`
6. Test MCP server: Start manually and query via MCP inspector
7. Deploy and test in Claude Code session
**Total Estimated Effort**: 1-2 hours
#### 4. Documentation Updates
**Files to Update**:
- `CLAUDE.md` - Add "Hybrid Search Architecture" section
- `CLAUDE.md` - Add "Vector Database Layer" section
- `CHANGELOG.md` - Add v4.4.0 release notes
- Consider: `EXPERIMENTAL_RELEASE_NOTES.md` (as suggested in plan)
**Total Estimated Effort**: 1 hour
---
## Critical Issues & Concerns
### 🔴 Issue #1: Inconclusive Search Quality Results
**Problem**: The experiment results in `RESULTS.md` show **contradictory** data:
- **Header claims**: "Semantic search outperformed by 3 queries (100% vs 63%)"
- **Actual results**: Chroma returned "No results" for 8/8 test queries
- **FTS5 results**: Returned results for 5/8 queries
**Analysis**:
Looking at the actual query results, **every semantic search query failed**:
- Query 1 (conceptual): Chroma ❌ No results, FTS5 ❌ No results
- Query 2 (patterns): Chroma ❌ No results, FTS5 ✅ 1 result
- Query 3 (file): Chroma ❌ No results, FTS5 ✅ 3 results
- Query 4 (function): Chroma ❌ No results, FTS5 ✅ 3 results
- Query 5 (technical): Chroma ❌ No results, FTS5 ❌ No results
- Query 6 (intent): Chroma ❌ No results, FTS5 ✅ 1 result
- Query 7 (error): Chroma ❌ No results, FTS5 ✅ 3 results
- Query 8 (design): Chroma ❌ No results, FTS5 ❌ No results
**Conclusion**: The summary at the top is **incorrect**. FTS5 actually outperformed Chroma 5-0.
**Root Cause Hypothesis**:
- The sync experiment created 8,279 documents from 1,390 observations
- The search test may have run **before** sync completed
- Or search test is using wrong collection name
- Or search test has a query parsing bug
**Action Required**:
- ✅ Re-run sync experiment (verified working above)
- ⚠️ Re-run search test to get accurate results
- ⚠️ Update RESULTS.md with correct findings
- ⚠️ **VALIDATE** that semantic search actually provides value before proceeding
### 🔴 Issue #2: Incomplete Implementation Cut Off Mid-Function
**Problem**: The `find_by_concept` handler in search-server.ts is incomplete (line 554 in diff). The code literally ends with:
```typescript
if (ids.includes(chromaId) && !rankedIds.includes(chromaId)) {
rankedIds.push(chromaId);
}
}
```
**Impact**:
- Handler won't work (syntax error likely)
- Can't test metadata-enhanced search workflows
- Blocks validation of core feature
**Action Required**:
- Complete the handler implementation
- Add error handling
- Add FTS5 fallback
- Test with actual queries
### 🟡 Issue #3: No Auto-Sync Implementation
**Problem**: The ChromaSync service doesn't exist yet. Without it:
- New observations won't appear in semantic search results
- Users must manually run sync experiment after each session
- Chroma database will become stale over time
**Impact**:
- Feature is not production-ready
- User experience is broken (missing recent context)
- Manual intervention required after every coding session
**Action Required**:
- Implement `src/services/sync/ChromaSync.ts`
- Integrate with worker-service.ts
- Add batch sync on startup
- Test sync pipeline end-to-end
### 🟡 Issue #4: Chroma Client Not Initialized
**Problem**: The search-server.ts declares `chromaClient` variable but never initializes it in `main()`.
**Impact**:
- All Chroma queries will fail with "Chroma client not initialized"
- Code will fall back to FTS5 for every query
- Hybrid search feature is effectively disabled
**Action Required**:
- Add client initialization to `main()` function
- Add connection error handling
- Log connection status for debugging
---
## Technical Debt & Concerns
### Design Pattern: Direct MCP Client Usage
**Current Approach**: The implementation uses direct MCP client calls with inline parsing helpers.
**Pros**:
- ✅ No abstraction overhead
- ✅ Parsing logic close to usage
- ✅ Avoids ChromaOrchestrator dead code pattern from experiment/chroma-mcp branch
**Cons**:
- ⚠️ Duplicated parsing logic (queryChroma helper called multiple times)
- ⚠️ Python dict parsing with regex is fragile
- ⚠️ Error handling must be duplicated across handlers
**Recommendation**: Current approach is acceptable, but consider extracting parsing logic to shared utility if it becomes more complex.
### Temporal Boundary: 90-Day Filter
**Current Setting**: Hard-coded 90-day recency window in search_observations handler.
**Concerns**:
- Not configurable
- May be too short for long-running projects
- May be too long for fast-moving projects
- No user control over recency vs semantic relevance trade-off
**Recommendation**: Consider making this configurable via MCP tool parameter in future iteration. For v4.4.0, 90 days is a reasonable default.
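The 90-day window translates to an epoch cutoff roughly like this (a sketch with the window made a parameter, illustrating the suggested future configuration point; the function name is hypothetical):

```typescript
// Compute the epoch-milliseconds cutoff for the recency window.
// Observations with created_at_epoch >= cutoff pass the temporal filter.
function recencyCutoffEpoch(days = 90, now: number = Date.now()): number {
  return now - days * 24 * 60 * 60 * 1000;
}
```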
### FTS5 Fallback Strategy
**Current Approach**: Each handler tries Chroma first, falls back to FTS5 on error.
**Pros**:
- ✅ Graceful degradation if Chroma unavailable
- ✅ No user-facing errors
**Cons**:
- ⚠️ Silent performance degradation (user doesn't know semantic search failed)
- ⚠️ No metrics on fallback frequency
- ⚠️ Doesn't distinguish between Chroma connection failure vs empty results
**Recommendation**: Add telemetry/logging to track fallback frequency. Consider user-visible warnings if Chroma consistently unavailable.
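One possible shape for a logging-aware fallback wrapper (all names hypothetical; a sketch of the recommendation, not existing code). It distinguishes a Chroma failure from an empty result set and logs why it fell back:

```typescript
// Try the semantic path first; on error OR empty results, log the reason
// and fall back to FTS5. Logging makes silent degradation observable.
async function withFts5Fallback<T>(
  semantic: () => Promise<T[]>,
  fts5: () => Promise<T[]>,
  log: (msg: string) => void = (m) => console.error(`[search-server] ${m}`),
): Promise<T[]> {
  try {
    const results = await semantic();
    if (results.length > 0) return results;
    log('semantic search returned no results; falling back to FTS5');
  } catch (err) {
    log(`semantic search failed (${(err as Error).message}); falling back to FTS5`);
  }
  return fts5();
}
```

Counting how often the catch branch fires (versus the empty-results branch) would give the fallback-frequency metric suggested above.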
---
## Validation Checklist (From Plan)
### Pre-Merge Requirements
**Code Completeness**:
- ❌ search-server.ts: Complete all handler implementations
- ❌ search-server.ts: Initialize Chroma client in main()
- ❌ ChromaSync.ts: Implement auto-sync service
- ❌ worker-service.ts: Integrate auto-sync calls
**Testing**:
- ⚠️ Sync experiment works (verified partially above)
- ❌ Search test shows Chroma returning relevant results (currently failing)
- ❌ MCP server starts and responds to queries
- ❌ Fallback to FTS5 works if Chroma unavailable
- ❌ Smoke tests pass (recent work, old concepts, file search, type search)
**Code Quality**:
- ✅ No breaking changes to MCP tool interfaces
- ✅ No dead code (ChromaOrchestrator not present)
- ⚠️ No stale build artifacts (need to verify)
- ❌ No uncommitted changes (will check after completion)
**Documentation**:
- ❌ CLAUDE.md updated with hybrid search architecture
- ❌ CHANGELOG.md has v4.4.0 release notes
- ❌ Experiment results validated and accurate
**Build**:
- ❌ Build succeeds without errors
- ❌ search-server.js is ESM format (not CJS)
- ❌ All hook scripts built correctly
---
## Recommended Next Steps
### Option A: Complete the Implementation (Recommended)
**Timeline**: 6-8 hours total
**Steps**:
1. **Re-validate experiments** (1 hour)
- Delete and re-sync Chroma collection
- Run search test and verify results
- Update RESULTS.md with accurate findings
- **DECISION POINT**: If semantic search doesn't work, stop here
2. **Complete search-server.ts** (2-3 hours)
- Initialize Chroma client
- Complete find_by_concept handler
- Implement find_by_type handler
- Implement find_by_file handler
- Add comprehensive error handling
3. **Implement ChromaSync** (2-3 hours)
- Create src/services/sync/ChromaSync.ts
- Integrate with worker-service.ts
- Test sync pipeline
4. **Validate and Document** (2 hours)
- Build and test MCP server
- Run smoke tests in Claude Code
- Update CLAUDE.md
- Write release notes
5. **Deploy** (30 minutes)
- Merge to main
- Tag v4.4.0
- Deploy to production
### Option B: Pause and Re-Validate (Conservative)
**Timeline**: 2-3 hours
**Steps**:
1. Re-run search quality experiments with fresh sync
2. Get accurate performance comparison data
3. **DECISION**: Proceed with implementation OR abandon feature
4. If abandoning: Document findings, close branch, move on
5. If proceeding: Continue with Option A
### Option C: Ship Minimal Version (Fast Path)
**Timeline**: 4-5 hours
**Steps**:
1. Complete only search_observations handler (skip metadata handlers)
2. Skip auto-sync (keep manual sync experiment)
3. Document as "experimental feature"
4. Merge with feature flag to disable by default
5. Iterate in future versions
---
## File Changes Summary
### Added Files (6)
- `experiment/README.md` (53 lines)
- `experiment/RESULTS.md` (210 lines)
- `experiment/chroma-search-test.ts` (304 lines)
- `experiment/chroma-sync-experiment.ts` (315 lines)
- `FEATURE_PLAN_HYBRID_SEARCH.md` (486 lines)
- `NEXT_SESSION_PROMPT.md` (193 lines)
### Modified Files (10)
- `src/servers/search-server.ts` (+319 lines)
- `src/services/sqlite/SessionStore.ts` (+27 lines)
- `src/shared/paths.ts` (+1 line)
- `plugin/scripts/cleanup-hook.js` (rebuilt)
- `plugin/scripts/context-hook.js` (rebuilt)
- `plugin/scripts/new-hook.js` (rebuilt)
- `plugin/scripts/save-hook.js` (rebuilt)
- `plugin/scripts/search-server.js` (rebuilt)
- `plugin/scripts/summary-hook.js` (rebuilt)
- `plugin/scripts/worker-service.cjs` (rebuilt)
### Files to Create
- `src/services/sync/ChromaSync.ts` (new, ~200 lines)
- `EXPERIMENTAL_RELEASE_NOTES.md` (optional)
### Files to Update
- `CLAUDE.md` (add hybrid search sections)
- `CHANGELOG.md` (add v4.4.0 release notes)
- `experiment/RESULTS.md` (fix incorrect summary)
---
## Timeline Estimate
From FEATURE_PLAN_HYBRID_SEARCH.md:
| Phase | Status | Time Estimate |
|-------|--------|---------------|
| Phase 1: Clean Start | ✅ Complete | 15 min (done) |
| Phase 2: Architecture Review | ✅ Complete | 30 min (done) |
| Phase 3: Implementation | ⚠️ 40% done | 2-3 hours (remaining) |
| Phase 4: Validation | ❌ Not started | 1 hour |
| Phase 5: Documentation | ❌ Not started | 1 hour |
| Phase 6: Deployment | ❌ Not started | 30 min |
| **TOTAL** | **~40% complete** | **~5-6 hours remaining** |
---
## Related Sessions (from claude-mem context)
- **Session #S558**: Critical analysis of experiment/chroma-mcp branch (different branch, has issues)
- **Session #S559**: Critical analysis of THIS branch (identified design validation complete)
- **Session #S560**: Created NEXT_SESSION_PROMPT.md with corrective plan
- **Session #S561**: Attempted to start but NEXT_SESSION_PROMPT.md was missing (now exists)
**Key Observation from Session #2975**:
> "Hybrid Search Architecture Validated for Production Implementation"
However, this appears to be based on the **incorrect** summary in RESULTS.md. The actual test results show Chroma failing all queries. This needs re-validation before proceeding.
---
## Conclusion
The hybrid search feature is **partially implemented** and requires **5-6 hours of focused work** to complete. The most critical blocker is **validating that semantic search actually works** - the current RESULTS.md shows contradictory data.
**Recommended Action**:
1. Re-run search quality experiments with fresh sync
2. Get accurate performance data
3. Make GO/NO-GO decision based on real results
4. If GO: Complete implementation per Option A
5. If NO-GO: Document findings and close branch
**Risk Assessment**:
- 🔴 **HIGH**: Search quality results are contradictory and unvalidated
- 🟡 **MEDIUM**: Implementation is incomplete (missing handlers + auto-sync)
- 🟢 **LOW**: Architecture is sound, experiment scripts work, plan is comprehensive
**Confidence Level**: 60% - The feature CAN work, but needs validation and completion before merge.
@@ -1,6 +1,6 @@
# Chroma MCP Search Experiment Results
**Date**: 2025-11-01T00:18:36.490Z
**Date**: 2025-11-01T03:14:23.093Z
**Project**: claude-mem
**Collection**: cm__claude-mem
@@ -43,9 +43,15 @@ Chroma's vector embeddings successfully handled conceptual queries that FTS5 com
#### 🟡 Keyword Search (FTS5)
**Status**: ✅ Found 1 results
**Status**: ✅ Found 2 results
**Result 1: Semantic search (Chroma) superior to keyword search (FTS5) for memory queries** (discovery)
**Result 1: Search Type Categories Tested: Mechanism, Problem-Solution, and Pattern Queries** (discovery)
```
The session systematically tested both search systems against diverse query types to understand search quality and relevance capabilities. Three primary categories emerged: (1) mechanism/how-to questions seeking explanations of system behavior, (2) problem-solution queries focused on troubleshooting and bug fixes, and (3) pattern/best-practice questions for architectural guidance. Additional testing included specific technical domain queries (context injection, PM2, FTS5) and operational queries (versioning, configuration, error handling). This taxonomy of query types provides a framework for evaluating and comparing search system quality across different information-seeking needs.
```
**Result 2: Semantic search (Chroma) superior to keyword search (FTS5) for memory queries** (discovery)
```
Testing revealed that semantic search via Chroma vastly outperforms traditional full-text search (FTS5) for the memory system use case. Across 8 diverse test queries, Chroma found relevant results in every case while FTS5 succeeded only 38% of the time. The gap is most pronounced for conceptual queries: FTS5 has no mechanism to understand queries like "problems with database synchronization" or "patterns for background workers" without exact keyword matches. Chroma, using vector embeddings, correctly interpreted semantic intent and returned highly relevant results even when exact phrases didn't appear in the database. For exact-match queries, both performed well, but Chroma ranked results by semantic relevance rather than just text occurrence. This data demonstrates semantic search should be the primary interface for memory retrieval.
@@ -2,7 +2,7 @@
"mcpServers": {
"claude-mem-search": {
"type": "stdio",
"command": "${CLAUDE_PLUGIN_ROOT}/scripts/search-server.js"
"command": "${CLAUDE_PLUGIN_ROOT}/scripts/search-server.mjs"
}
}
}
File diff suppressed because one or more lines are too long
@@ -0,0 +1,526 @@
#!/usr/bin/env node
import{Server as Q}from"@modelcontextprotocol/sdk/server/index.js";import{StdioServerTransport as z}from"@modelcontextprotocol/sdk/server/stdio.js";import{Client as Z}from"@modelcontextprotocol/sdk/client/index.js";import{StdioClientTransport as ee}from"@modelcontextprotocol/sdk/client/stdio.js";import{CallToolRequestSchema as se,ListToolsRequestSchema as te}from"@modelcontextprotocol/sdk/types.js";import{z as i}from"zod";import{zodToJsonSchema as re}from"zod-to-json-schema";import{basename as ne}from"path";import K from"better-sqlite3";import{join as g,dirname as W,basename as ue}from"path";import{homedir as j}from"os";import{existsSync as Ee,mkdirSync as q}from"fs";import{fileURLToPath as Y}from"url";function V(){return typeof __dirname<"u"?__dirname:W(Y(import.meta.url))}var fe=V(),b=process.env.CLAUDE_MEM_DATA_DIR||g(j(),".claude-mem"),F=process.env.CLAUDE_CONFIG_DIR||g(j(),".claude"),Te=g(b,"archives"),Se=g(b,"logs"),ge=g(b,"trash"),be=g(b,"backups"),Re=g(b,"settings.json"),I=g(b,"claude-mem.db"),B=g(b,"vector-db"),Oe=g(F,"settings.json"),ye=g(F,"commands"),Ne=g(F,"CLAUDE.md");function x(a){q(a,{recursive:!0})}var C=class{db;constructor(e){e||(x(b),e=I),this.db=new K(e),this.db.pragma("journal_mode = WAL"),this.ensureFTSTables()}ensureFTSTables(){try{if(this.db.prepare("SELECT name FROM sqlite_master WHERE type='table' AND name LIKE '%_fts'").all().some(s=>s.name==="observations_fts"||s.name==="session_summaries_fts"))return;console.error("[SessionSearch] Creating FTS5 tables..."),this.db.exec(`
CREATE VIRTUAL TABLE IF NOT EXISTS observations_fts USING fts5(
title,
subtitle,
narrative,
text,
facts,
concepts,
content='observations',
content_rowid='id'
);
`),this.db.exec(`
INSERT INTO observations_fts(rowid, title, subtitle, narrative, text, facts, concepts)
SELECT id, title, subtitle, narrative, text, facts, concepts
FROM observations;
`),this.db.exec(`
CREATE TRIGGER IF NOT EXISTS observations_ai AFTER INSERT ON observations BEGIN
INSERT INTO observations_fts(rowid, title, subtitle, narrative, text, facts, concepts)
VALUES (new.id, new.title, new.subtitle, new.narrative, new.text, new.facts, new.concepts);
END;
CREATE TRIGGER IF NOT EXISTS observations_ad AFTER DELETE ON observations BEGIN
INSERT INTO observations_fts(observations_fts, rowid, title, subtitle, narrative, text, facts, concepts)
VALUES('delete', old.id, old.title, old.subtitle, old.narrative, old.text, old.facts, old.concepts);
END;
CREATE TRIGGER IF NOT EXISTS observations_au AFTER UPDATE ON observations BEGIN
INSERT INTO observations_fts(observations_fts, rowid, title, subtitle, narrative, text, facts, concepts)
VALUES('delete', old.id, old.title, old.subtitle, old.narrative, old.text, old.facts, old.concepts);
INSERT INTO observations_fts(rowid, title, subtitle, narrative, text, facts, concepts)
VALUES (new.id, new.title, new.subtitle, new.narrative, new.text, new.facts, new.concepts);
END;
`),this.db.exec(`
CREATE VIRTUAL TABLE IF NOT EXISTS session_summaries_fts USING fts5(
request,
investigated,
learned,
completed,
next_steps,
notes,
content='session_summaries',
content_rowid='id'
);
`),this.db.exec(`
INSERT INTO session_summaries_fts(rowid, request, investigated, learned, completed, next_steps, notes)
SELECT id, request, investigated, learned, completed, next_steps, notes
FROM session_summaries;
`),this.db.exec(`
CREATE TRIGGER IF NOT EXISTS session_summaries_ai AFTER INSERT ON session_summaries BEGIN
INSERT INTO session_summaries_fts(rowid, request, investigated, learned, completed, next_steps, notes)
VALUES (new.id, new.request, new.investigated, new.learned, new.completed, new.next_steps, new.notes);
END;
CREATE TRIGGER IF NOT EXISTS session_summaries_ad AFTER DELETE ON session_summaries BEGIN
INSERT INTO session_summaries_fts(session_summaries_fts, rowid, request, investigated, learned, completed, next_steps, notes)
VALUES('delete', old.id, old.request, old.investigated, old.learned, old.completed, old.next_steps, old.notes);
END;
CREATE TRIGGER IF NOT EXISTS session_summaries_au AFTER UPDATE ON session_summaries BEGIN
INSERT INTO session_summaries_fts(session_summaries_fts, rowid, request, investigated, learned, completed, next_steps, notes)
VALUES('delete', old.id, old.request, old.investigated, old.learned, old.completed, old.next_steps, old.notes);
INSERT INTO session_summaries_fts(rowid, request, investigated, learned, completed, next_steps, notes)
VALUES (new.id, new.request, new.investigated, new.learned, new.completed, new.next_steps, new.notes);
END;
`),console.error("[SessionSearch] FTS5 tables created successfully")}catch(e){console.error("[SessionSearch] FTS migration error:",e.message)}}escapeFTS5(e){return`"${e.replace(/"/g,'""')}"`}buildFilterClause(e,r,s="o"){let t=[];if(e.project&&(t.push(`${s}.project = ?`),r.push(e.project)),e.type)if(Array.isArray(e.type)){let o=e.type.map(()=>"?").join(",");t.push(`${s}.type IN (${o})`),r.push(...e.type)}else t.push(`${s}.type = ?`),r.push(e.type);if(e.dateRange){let{start:o,end:n}=e.dateRange;if(o){let c=typeof o=="number"?o:new Date(o).getTime();t.push(`${s}.created_at_epoch >= ?`),r.push(c)}if(n){let c=typeof n=="number"?n:new Date(n).getTime();t.push(`${s}.created_at_epoch <= ?`),r.push(c)}}if(e.concepts){let o=Array.isArray(e.concepts)?e.concepts:[e.concepts],n=o.map(()=>`EXISTS (SELECT 1 FROM json_each(${s}.concepts) WHERE value = ?)`);n.length>0&&(t.push(`(${n.join(" OR ")})`),r.push(...o))}if(e.files){let o=Array.isArray(e.files)?e.files:[e.files],n=o.map(()=>`(
EXISTS (SELECT 1 FROM json_each(${s}.files_read) WHERE value LIKE ?)
OR EXISTS (SELECT 1 FROM json_each(${s}.files_modified) WHERE value LIKE ?)
)`);n.length>0&&(t.push(`(${n.join(" OR ")})`),o.forEach(c=>{r.push(`%${c}%`,`%${c}%`)}))}return t.length>0?t.join(" AND "):""}buildOrderClause(e="relevance",r=!0,s="observations_fts"){switch(e){case"relevance":return r?`ORDER BY ${s}.rank ASC`:"ORDER BY o.created_at_epoch DESC";case"date_desc":return"ORDER BY o.created_at_epoch DESC";case"date_asc":return"ORDER BY o.created_at_epoch ASC";default:return"ORDER BY o.created_at_epoch DESC"}}searchObservations(e,r={}){let s=[],{limit:t=50,offset:o=0,orderBy:n="relevance",...c}=r,d=this.escapeFTS5(e);s.push(d);let l=this.buildFilterClause(c,s,"o"),u=l?`AND ${l}`:"",p=this.buildOrderClause(n,!0),m=`
SELECT
o.*,
observations_fts.rank as rank
FROM observations o
JOIN observations_fts ON o.id = observations_fts.rowid
WHERE observations_fts MATCH ?
${u}
${p}
LIMIT ? OFFSET ?
`;s.push(t,o);let _=this.db.prepare(m).all(...s);if(_.length>0){let E=Math.min(..._.map(f=>f.rank||0)),T=Math.max(..._.map(f=>f.rank||0))-E||1;_.forEach(f=>{f.rank!==void 0&&(f.score=1-(f.rank-E)/T)})}return _}searchSessions(e,r={}){let s=[],{limit:t=50,offset:o=0,orderBy:n="relevance",...c}=r,d=this.escapeFTS5(e);s.push(d);let l={...c};delete l.type;let u=this.buildFilterClause(l,s,"s"),E=`
SELECT
s.*,
session_summaries_fts.rank as rank
FROM session_summaries s
JOIN session_summaries_fts ON s.id = session_summaries_fts.rowid
WHERE session_summaries_fts MATCH ?
${(u?`AND ${u}`:"").replace(/files_read/g,"files_read").replace(/files_modified/g,"files_edited")}
${n==="relevance"?"ORDER BY session_summaries_fts.rank ASC":n==="date_asc"?"ORDER BY s.created_at_epoch ASC":"ORDER BY s.created_at_epoch DESC"}
LIMIT ? OFFSET ?
`;s.push(t,o);let h=this.db.prepare(E).all(...s);if(h.length>0){let T=Math.min(...h.map(S=>S.rank||0)),O=Math.max(...h.map(S=>S.rank||0))-T||1;h.forEach(S=>{S.rank!==void 0&&(S.score=1-(S.rank-T)/O)})}return h}findByConcept(e,r={}){let s=[],{limit:t=50,offset:o=0,orderBy:n="date_desc",...c}=r,d={...c,concepts:e},l=this.buildFilterClause(d,s,"o"),u=this.buildOrderClause(n,!1),p=`
SELECT o.*
FROM observations o
WHERE ${l}
${u}
LIMIT ? OFFSET ?
`;return s.push(t,o),this.db.prepare(p).all(...s)}findByFile(e,r={}){let s=[],{limit:t=50,offset:o=0,orderBy:n="date_desc",...c}=r,d={...c,files:e},l=this.buildFilterClause(d,s,"o"),u=this.buildOrderClause(n,!1),p=`
SELECT o.*
FROM observations o
WHERE ${l}
${u}
LIMIT ? OFFSET ?
`;s.push(t,o);let m=this.db.prepare(p).all(...s),_=[],E={...c};delete E.type;let h=[];if(E.project&&(h.push("s.project = ?"),_.push(E.project)),E.dateRange){let{start:O,end:S}=E.dateRange;if(O){let k=typeof O=="number"?O:new Date(O).getTime();h.push("s.created_at_epoch >= ?"),_.push(k)}if(S){let k=typeof S=="number"?S:new Date(S).getTime();h.push("s.created_at_epoch <= ?"),_.push(k)}}h.push(`(
EXISTS (SELECT 1 FROM json_each(s.files_read) WHERE value LIKE ?)
OR EXISTS (SELECT 1 FROM json_each(s.files_edited) WHERE value LIKE ?)
)`),_.push(`%${e}%`,`%${e}%`);let T=`
SELECT s.*
FROM session_summaries s
WHERE ${h.join(" AND ")}
ORDER BY s.created_at_epoch DESC
LIMIT ? OFFSET ?
`;_.push(t,o);let f=this.db.prepare(T).all(..._);return{observations:m,sessions:f}}findByType(e,r={}){let s=[],{limit:t=50,offset:o=0,orderBy:n="date_desc",...c}=r,d={...c,type:e},l=this.buildFilterClause(d,s,"o"),u=this.buildOrderClause(n,!1),p=`
SELECT o.*
FROM observations o
WHERE ${l}
${u}
LIMIT ? OFFSET ?
`;return s.push(t,o),this.db.prepare(p).all(...s)}searchUserPrompts(e,r={}){let s=[],{limit:t=20,offset:o=0,orderBy:n="relevance",...c}=r,d=this.escapeFTS5(e);s.push(d);let l=[];if(c.project&&(l.push("s.project = ?"),s.push(c.project)),c.dateRange){let{start:E,end:h}=c.dateRange;if(E){let T=typeof E=="number"?E:new Date(E).getTime();l.push("up.created_at_epoch >= ?"),s.push(T)}if(h){let T=typeof h=="number"?h:new Date(h).getTime();l.push("up.created_at_epoch <= ?"),s.push(T)}}let m=`
SELECT
up.*,
user_prompts_fts.rank as rank
FROM user_prompts up
JOIN user_prompts_fts ON up.id = user_prompts_fts.rowid
JOIN sdk_sessions s ON up.claude_session_id = s.claude_session_id
WHERE user_prompts_fts MATCH ?
${l.length>0?`AND ${l.join(" AND ")}`:""}
${n==="relevance"?"ORDER BY user_prompts_fts.rank ASC":n==="date_asc"?"ORDER BY up.created_at_epoch ASC":"ORDER BY up.created_at_epoch DESC"}
LIMIT ? OFFSET ?
`;s.push(t,o);let _=this.db.prepare(m).all(...s);if(_.length>0){let E=Math.min(..._.map(f=>f.rank||0)),T=Math.max(..._.map(f=>f.rank||0))-E||1;_.forEach(f=>{f.rank!==void 0&&(f.score=1-(f.rank-E)/T)})}return _}getUserPromptsBySession(e){return this.db.prepare(`
SELECT
id,
claude_session_id,
prompt_number,
prompt_text,
created_at,
created_at_epoch
FROM user_prompts
WHERE claude_session_id = ?
ORDER BY prompt_number ASC
`).all(e)}close(){this.db.close()}};import J from"better-sqlite3";var $=(o=>(o[o.DEBUG=0]="DEBUG",o[o.INFO=1]="INFO",o[o.WARN=2]="WARN",o[o.ERROR=3]="ERROR",o[o.SILENT=4]="SILENT",o))($||{}),U=class{level;useColor;constructor(){let e=process.env.CLAUDE_MEM_LOG_LEVEL?.toUpperCase()||"INFO";this.level=$[e]??1,this.useColor=process.stdout.isTTY??!1}correlationId(e,r){return`obs-${e}-${r}`}sessionId(e){return`session-${e}`}formatData(e){if(e==null)return"";if(typeof e=="string")return e;if(typeof e=="number"||typeof e=="boolean")return e.toString();if(typeof e=="object"){if(e instanceof Error)return this.level===0?`${e.message}
${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let r=Object.keys(e);return r.length===0?"{}":r.length<=3?JSON.stringify(e):`{${r.length} keys: ${r.slice(0,3).join(", ")}...}`}return String(e)}formatTool(e,r){if(!r)return e;try{let s=typeof r=="string"?JSON.parse(r):r;if(e==="Bash"&&s.command){let t=s.command.length>50?s.command.substring(0,50)+"...":s.command;return`${e}(${t})`}if(e==="Read"&&s.file_path){let t=s.file_path.split("/").pop()||s.file_path;return`${e}(${t})`}if(e==="Edit"&&s.file_path){let t=s.file_path.split("/").pop()||s.file_path;return`${e}(${t})`}if(e==="Write"&&s.file_path){let t=s.file_path.split("/").pop()||s.file_path;return`${e}(${t})`}return e}catch{return e}}log(e,r,s,t,o){if(e<this.level)return;let n=new Date().toISOString().replace("T"," ").substring(0,23),c=$[e].padEnd(5),d=r.padEnd(6),l="";t?.correlationId?l=`[${t.correlationId}] `:t?.sessionId&&(l=`[session-${t.sessionId}] `);let u="";o!=null&&(this.level===0&&typeof o=="object"?u=`
`+JSON.stringify(o,null,2):u=" "+this.formatData(o));let p="";if(t){let{sessionId:_,sdkSessionId:E,correlationId:h,...T}=t;Object.keys(T).length>0&&(p=` {${Object.entries(T).map(([O,S])=>`${O}=${S}`).join(", ")}}`)}let m=`[${n}] [${c}] [${d}] ${l}${s}${p}${u}`;e===3?console.error(m):console.log(m)}debug(e,r,s,t){this.log(0,e,r,s,t)}info(e,r,s,t){this.log(1,e,r,s,t)}warn(e,r,s,t){this.log(2,e,r,s,t)}error(e,r,s,t){this.log(3,e,r,s,t)}dataIn(e,r,s,t){this.info(e,`\u2192 ${r}`,s,t)}dataOut(e,r,s,t){this.info(e,`\u2190 ${r}`,s,t)}success(e,r,s,t){this.info(e,`\u2713 ${r}`,s,t)}failure(e,r,s,t){this.error(e,`\u2717 ${r}`,s,t)}timing(e,r,s,t){this.info(e,`\u23F1 ${r}`,t,{duration:`${s}ms`})}},X=new U;var L=class{db;constructor(){x(b),this.db=new J(I),this.db.pragma("journal_mode = WAL"),this.db.pragma("synchronous = NORMAL"),this.db.pragma("foreign_keys = ON"),this.initializeSchema(),this.ensureWorkerPortColumn(),this.ensurePromptTrackingColumns(),this.removeSessionSummariesUniqueConstraint(),this.addObservationHierarchicalFields(),this.makeObservationsTextNullable(),this.createUserPromptsTable()}initializeSchema(){try{this.db.exec(`
CREATE TABLE IF NOT EXISTS schema_versions (
id INTEGER PRIMARY KEY,
version INTEGER UNIQUE NOT NULL,
applied_at TEXT NOT NULL
)
`);let e=this.db.prepare("SELECT version FROM schema_versions ORDER BY version").all();(e.length>0?Math.max(...e.map(s=>s.version)):0)===0&&(console.error("[SessionStore] Initializing fresh database with migration004..."),this.db.exec(`
CREATE TABLE IF NOT EXISTS sdk_sessions (
id INTEGER PRIMARY KEY AUTOINCREMENT,
claude_session_id TEXT UNIQUE NOT NULL,
sdk_session_id TEXT UNIQUE,
project TEXT NOT NULL,
user_prompt TEXT,
started_at TEXT NOT NULL,
started_at_epoch INTEGER NOT NULL,
completed_at TEXT,
completed_at_epoch INTEGER,
status TEXT CHECK(status IN ('active', 'completed', 'failed')) NOT NULL DEFAULT 'active'
);
CREATE INDEX IF NOT EXISTS idx_sdk_sessions_claude_id ON sdk_sessions(claude_session_id);
CREATE INDEX IF NOT EXISTS idx_sdk_sessions_sdk_id ON sdk_sessions(sdk_session_id);
CREATE INDEX IF NOT EXISTS idx_sdk_sessions_project ON sdk_sessions(project);
CREATE INDEX IF NOT EXISTS idx_sdk_sessions_status ON sdk_sessions(status);
CREATE INDEX IF NOT EXISTS idx_sdk_sessions_started ON sdk_sessions(started_at_epoch DESC);
CREATE TABLE IF NOT EXISTS observations (
id INTEGER PRIMARY KEY AUTOINCREMENT,
sdk_session_id TEXT NOT NULL,
project TEXT NOT NULL,
text TEXT NOT NULL,
type TEXT NOT NULL CHECK(type IN ('decision', 'bugfix', 'feature', 'refactor', 'discovery')),
created_at TEXT NOT NULL,
created_at_epoch INTEGER NOT NULL,
FOREIGN KEY(sdk_session_id) REFERENCES sdk_sessions(sdk_session_id) ON DELETE CASCADE
);
CREATE INDEX IF NOT EXISTS idx_observations_sdk_session ON observations(sdk_session_id);
CREATE INDEX IF NOT EXISTS idx_observations_project ON observations(project);
CREATE INDEX IF NOT EXISTS idx_observations_type ON observations(type);
CREATE INDEX IF NOT EXISTS idx_observations_created ON observations(created_at_epoch DESC);
CREATE TABLE IF NOT EXISTS session_summaries (
id INTEGER PRIMARY KEY AUTOINCREMENT,
sdk_session_id TEXT UNIQUE NOT NULL,
project TEXT NOT NULL,
request TEXT,
investigated TEXT,
learned TEXT,
completed TEXT,
next_steps TEXT,
files_read TEXT,
files_edited TEXT,
notes TEXT,
created_at TEXT NOT NULL,
created_at_epoch INTEGER NOT NULL,
FOREIGN KEY(sdk_session_id) REFERENCES sdk_sessions(sdk_session_id) ON DELETE CASCADE
);
CREATE INDEX IF NOT EXISTS idx_session_summaries_sdk_session ON session_summaries(sdk_session_id);
CREATE INDEX IF NOT EXISTS idx_session_summaries_project ON session_summaries(project);
CREATE INDEX IF NOT EXISTS idx_session_summaries_created ON session_summaries(created_at_epoch DESC);
`),this.db.prepare("INSERT INTO schema_versions (version, applied_at) VALUES (?, ?)").run(4,new Date().toISOString()),console.error("[SessionStore] Migration004 applied successfully"))}catch(e){throw console.error("[SessionStore] Schema initialization error:",e.message),e}}ensureWorkerPortColumn(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(5))return;this.db.pragma("table_info(sdk_sessions)").some(t=>t.name==="worker_port")||(this.db.exec("ALTER TABLE sdk_sessions ADD COLUMN worker_port INTEGER"),console.error("[SessionStore] Added worker_port column to sdk_sessions table")),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(5,new Date().toISOString())}catch(e){console.error("[SessionStore] Migration error:",e.message)}}ensurePromptTrackingColumns(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(6))return;this.db.pragma("table_info(sdk_sessions)").some(d=>d.name==="prompt_counter")||(this.db.exec("ALTER TABLE sdk_sessions ADD COLUMN prompt_counter INTEGER DEFAULT 0"),console.error("[SessionStore] Added prompt_counter column to sdk_sessions table")),this.db.pragma("table_info(observations)").some(d=>d.name==="prompt_number")||(this.db.exec("ALTER TABLE observations ADD COLUMN prompt_number INTEGER"),console.error("[SessionStore] Added prompt_number column to observations table")),this.db.pragma("table_info(session_summaries)").some(d=>d.name==="prompt_number")||(this.db.exec("ALTER TABLE session_summaries ADD COLUMN prompt_number INTEGER"),console.error("[SessionStore] Added prompt_number column to session_summaries table")),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(6,new Date().toISOString())}catch(e){console.error("[SessionStore] Prompt tracking migration error:",e.message)}}removeSessionSummariesUniqueConstraint(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(7))return;if(!this.db.pragma("index_list(session_summaries)").some(t=>t.unique===1)){this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(7,new Date().toISOString());return}console.error("[SessionStore] Removing UNIQUE constraint from session_summaries.sdk_session_id..."),this.db.exec("BEGIN TRANSACTION");try{this.db.exec(`
CREATE TABLE session_summaries_new (
id INTEGER PRIMARY KEY AUTOINCREMENT,
sdk_session_id TEXT NOT NULL,
project TEXT NOT NULL,
request TEXT,
investigated TEXT,
learned TEXT,
completed TEXT,
next_steps TEXT,
files_read TEXT,
files_edited TEXT,
notes TEXT,
prompt_number INTEGER,
created_at TEXT NOT NULL,
created_at_epoch INTEGER NOT NULL,
FOREIGN KEY(sdk_session_id) REFERENCES sdk_sessions(sdk_session_id) ON DELETE CASCADE
)
`),this.db.exec(`
INSERT INTO session_summaries_new
SELECT id, sdk_session_id, project, request, investigated, learned,
completed, next_steps, files_read, files_edited, notes,
prompt_number, created_at, created_at_epoch
FROM session_summaries
`),this.db.exec("DROP TABLE session_summaries"),this.db.exec("ALTER TABLE session_summaries_new RENAME TO session_summaries"),this.db.exec(`
CREATE INDEX idx_session_summaries_sdk_session ON session_summaries(sdk_session_id);
CREATE INDEX idx_session_summaries_project ON session_summaries(project);
CREATE INDEX idx_session_summaries_created ON session_summaries(created_at_epoch DESC);
`),this.db.exec("COMMIT"),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(7,new Date().toISOString()),console.error("[SessionStore] Successfully removed UNIQUE constraint from session_summaries.sdk_session_id")}catch(t){throw this.db.exec("ROLLBACK"),t}}catch(e){console.error("[SessionStore] Migration error (remove UNIQUE constraint):",e.message)}}addObservationHierarchicalFields(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(8))return;if(this.db.pragma("table_info(observations)").some(t=>t.name==="title")){this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(8,new Date().toISOString());return}console.error("[SessionStore] Adding hierarchical fields to observations table..."),this.db.exec(`
ALTER TABLE observations ADD COLUMN title TEXT;
ALTER TABLE observations ADD COLUMN subtitle TEXT;
ALTER TABLE observations ADD COLUMN facts TEXT;
ALTER TABLE observations ADD COLUMN narrative TEXT;
ALTER TABLE observations ADD COLUMN concepts TEXT;
ALTER TABLE observations ADD COLUMN files_read TEXT;
ALTER TABLE observations ADD COLUMN files_modified TEXT;
`),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(8,new Date().toISOString()),console.error("[SessionStore] Successfully added hierarchical fields to observations table")}catch(e){console.error("[SessionStore] Migration error (add hierarchical fields):",e.message)}}makeObservationsTextNullable(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(9))return;let s=this.db.pragma("table_info(observations)").find(t=>t.name==="text");if(!s||s.notnull===0){this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(9,new Date().toISOString());return}console.error("[SessionStore] Making observations.text nullable..."),this.db.exec("BEGIN TRANSACTION");try{this.db.exec(`
CREATE TABLE observations_new (
id INTEGER PRIMARY KEY AUTOINCREMENT,
sdk_session_id TEXT NOT NULL,
project TEXT NOT NULL,
text TEXT,
type TEXT NOT NULL CHECK(type IN ('decision', 'bugfix', 'feature', 'refactor', 'discovery', 'change')),
title TEXT,
subtitle TEXT,
facts TEXT,
narrative TEXT,
concepts TEXT,
files_read TEXT,
files_modified TEXT,
prompt_number INTEGER,
created_at TEXT NOT NULL,
created_at_epoch INTEGER NOT NULL,
FOREIGN KEY(sdk_session_id) REFERENCES sdk_sessions(sdk_session_id) ON DELETE CASCADE
)
`),this.db.exec(`
INSERT INTO observations_new
SELECT id, sdk_session_id, project, text, type, title, subtitle, facts,
narrative, concepts, files_read, files_modified, prompt_number,
created_at, created_at_epoch
FROM observations
`),this.db.exec("DROP TABLE observations"),this.db.exec("ALTER TABLE observations_new RENAME TO observations"),this.db.exec(`
CREATE INDEX idx_observations_sdk_session ON observations(sdk_session_id);
CREATE INDEX idx_observations_project ON observations(project);
CREATE INDEX idx_observations_type ON observations(type);
CREATE INDEX idx_observations_created ON observations(created_at_epoch DESC);
`),this.db.exec("COMMIT"),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(9,new Date().toISOString()),console.error("[SessionStore] Successfully made observations.text nullable")}catch(t){throw this.db.exec("ROLLBACK"),t}}catch(e){console.error("[SessionStore] Migration error (make text nullable):",e.message)}}createUserPromptsTable(){try{if(this.db.prepare("SELECT version FROM schema_versions WHERE version = ?").get(10))return;if(this.db.pragma("table_info(user_prompts)").length>0){this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(10,new Date().toISOString());return}console.error("[SessionStore] Creating user_prompts table with FTS5 support..."),this.db.exec("BEGIN TRANSACTION");try{this.db.exec(`
CREATE TABLE user_prompts (
id INTEGER PRIMARY KEY AUTOINCREMENT,
claude_session_id TEXT NOT NULL,
prompt_number INTEGER NOT NULL,
prompt_text TEXT NOT NULL,
created_at TEXT NOT NULL,
created_at_epoch INTEGER NOT NULL,
FOREIGN KEY(claude_session_id) REFERENCES sdk_sessions(claude_session_id) ON DELETE CASCADE
);
CREATE INDEX idx_user_prompts_claude_session ON user_prompts(claude_session_id);
CREATE INDEX idx_user_prompts_created ON user_prompts(created_at_epoch DESC);
CREATE INDEX idx_user_prompts_prompt_number ON user_prompts(prompt_number);
`),this.db.exec(`
CREATE VIRTUAL TABLE user_prompts_fts USING fts5(
prompt_text,
content='user_prompts',
content_rowid='id'
);
`),this.db.exec(`
CREATE TRIGGER user_prompts_ai AFTER INSERT ON user_prompts BEGIN
INSERT INTO user_prompts_fts(rowid, prompt_text)
VALUES (new.id, new.prompt_text);
END;
CREATE TRIGGER user_prompts_ad AFTER DELETE ON user_prompts BEGIN
INSERT INTO user_prompts_fts(user_prompts_fts, rowid, prompt_text)
VALUES('delete', old.id, old.prompt_text);
END;
CREATE TRIGGER user_prompts_au AFTER UPDATE ON user_prompts BEGIN
INSERT INTO user_prompts_fts(user_prompts_fts, rowid, prompt_text)
VALUES('delete', old.id, old.prompt_text);
INSERT INTO user_prompts_fts(rowid, prompt_text)
VALUES (new.id, new.prompt_text);
END;
`),this.db.exec("COMMIT"),this.db.prepare("INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)").run(10,new Date().toISOString()),console.error("[SessionStore] Successfully created user_prompts table with FTS5 support")}catch(s){throw this.db.exec("ROLLBACK"),s}}catch(e){console.error("[SessionStore] Migration error (create user_prompts table):",e.message)}}getRecentSummaries(e,r=10){return this.db.prepare(`
SELECT
request, investigated, learned, completed, next_steps,
files_read, files_edited, notes, prompt_number, created_at
FROM session_summaries
WHERE project = ?
ORDER BY created_at_epoch DESC
LIMIT ?
`).all(e,r)}getRecentSummariesWithSessionInfo(e,r=3){return this.db.prepare(`
SELECT
sdk_session_id, request, learned, completed, next_steps,
prompt_number, created_at
FROM session_summaries
WHERE project = ?
ORDER BY created_at_epoch DESC
LIMIT ?
`).all(e,r)}getRecentObservations(e,r=20){return this.db.prepare(`
SELECT type, text, prompt_number, created_at
FROM observations
WHERE project = ?
ORDER BY created_at_epoch DESC
LIMIT ?
`).all(e,r)}getRecentSessionsWithStatus(e,r=3){return this.db.prepare(`
SELECT * FROM (
SELECT
s.sdk_session_id,
s.status,
s.started_at,
s.started_at_epoch,
s.user_prompt,
CASE WHEN sum.sdk_session_id IS NOT NULL THEN 1 ELSE 0 END as has_summary
FROM sdk_sessions s
LEFT JOIN session_summaries sum ON s.sdk_session_id = sum.sdk_session_id
WHERE s.project = ? AND s.sdk_session_id IS NOT NULL
GROUP BY s.sdk_session_id
ORDER BY s.started_at_epoch DESC
LIMIT ?
)
ORDER BY started_at_epoch ASC
`).all(e,r)}getObservationsForSession(e){return this.db.prepare(`
SELECT title, subtitle, type, prompt_number
FROM observations
WHERE sdk_session_id = ?
ORDER BY created_at_epoch ASC
`).all(e)}getObservationsByIds(e,r={}){if(e.length===0)return[];let{orderBy:s="date_desc",limit:t}=r,o=s==="date_asc"?"ASC":"DESC",n=t?`LIMIT ${t}`:"",c=e.map(()=>"?").join(",");return this.db.prepare(`
SELECT *
FROM observations
WHERE id IN (${c})
ORDER BY created_at_epoch ${o}
${n}
`).all(...e)}getSummaryForSession(e){return this.db.prepare(`
SELECT
request, investigated, learned, completed, next_steps,
files_read, files_edited, notes, prompt_number, created_at
FROM session_summaries
WHERE sdk_session_id = ?
ORDER BY created_at_epoch DESC
LIMIT 1
`).get(e)||null}getFilesForSession(e){let s=this.db.prepare(`
SELECT files_read, files_modified
FROM observations
WHERE sdk_session_id = ?
`).all(e),t=new Set,o=new Set;for(let n of s){if(n.files_read)try{let c=JSON.parse(n.files_read);Array.isArray(c)&&c.forEach(d=>t.add(d))}catch{}if(n.files_modified)try{let c=JSON.parse(n.files_modified);Array.isArray(c)&&c.forEach(d=>o.add(d))}catch{}}return{filesRead:Array.from(t),filesModified:Array.from(o)}}getSessionById(e){return this.db.prepare(`
SELECT id, claude_session_id, sdk_session_id, project, user_prompt
FROM sdk_sessions
WHERE id = ?
LIMIT 1
`).get(e)||null}findActiveSDKSession(e){return this.db.prepare(`
SELECT id, sdk_session_id, project, worker_port
FROM sdk_sessions
WHERE claude_session_id = ? AND status = 'active'
LIMIT 1
`).get(e)||null}findAnySDKSession(e){return this.db.prepare(`
SELECT id
FROM sdk_sessions
WHERE claude_session_id = ?
LIMIT 1
`).get(e)||null}reactivateSession(e,r){this.db.prepare(`
UPDATE sdk_sessions
SET status = 'active', user_prompt = ?, worker_port = NULL
WHERE id = ?
`).run(r,e)}incrementPromptCounter(e){return this.db.prepare(`
UPDATE sdk_sessions
SET prompt_counter = COALESCE(prompt_counter, 0) + 1
WHERE id = ?
`).run(e),this.db.prepare(`
SELECT prompt_counter FROM sdk_sessions WHERE id = ?
`).get(e)?.prompt_counter||1}getPromptCounter(e){return this.db.prepare(`
SELECT prompt_counter FROM sdk_sessions WHERE id = ?
`).get(e)?.prompt_counter||0}createSDKSession(e,r,s){let t=new Date,o=t.getTime(),c=this.db.prepare(`
INSERT OR IGNORE INTO sdk_sessions
(claude_session_id, sdk_session_id, project, user_prompt, started_at, started_at_epoch, status)
VALUES (?, ?, ?, ?, ?, ?, 'active')
`).run(e,e,r,s,t.toISOString(),o);return c.lastInsertRowid===0||c.changes===0?this.db.prepare(`
SELECT id FROM sdk_sessions WHERE claude_session_id = ? LIMIT 1
`).get(e).id:c.lastInsertRowid}updateSDKSessionId(e,r){return this.db.prepare(`
UPDATE sdk_sessions
SET sdk_session_id = ?
WHERE id = ? AND sdk_session_id IS NULL
`).run(r,e).changes===0?(X.debug("DB","sdk_session_id already set, skipping update",{sessionId:e,sdkSessionId:r}),!1):!0}setWorkerPort(e,r){this.db.prepare(`
UPDATE sdk_sessions
SET worker_port = ?
WHERE id = ?
`).run(r,e)}getWorkerPort(e){return this.db.prepare(`
SELECT worker_port
FROM sdk_sessions
WHERE id = ?
LIMIT 1
`).get(e)?.worker_port||null}saveUserPrompt(e,r,s){let t=new Date,o=t.getTime();return this.db.prepare(`
INSERT INTO user_prompts
(claude_session_id, prompt_number, prompt_text, created_at, created_at_epoch)
VALUES (?, ?, ?, ?, ?)
`).run(e,r,s,t.toISOString(),o).lastInsertRowid}storeObservation(e,r,s,t){let o=new Date,n=o.getTime();this.db.prepare(`
SELECT id FROM sdk_sessions WHERE sdk_session_id = ?
`).get(e)||(this.db.prepare(`
INSERT INTO sdk_sessions
(claude_session_id, sdk_session_id, project, started_at, started_at_epoch, status)
VALUES (?, ?, ?, ?, ?, 'active')
`).run(e,e,r,o.toISOString(),n),console.error(`[SessionStore] Auto-created session record for session_id: ${e}`)),this.db.prepare(`
INSERT INTO observations
(sdk_session_id, project, type, title, subtitle, facts, narrative, concepts,
files_read, files_modified, prompt_number, created_at, created_at_epoch)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
`).run(e,r,s.type,s.title,s.subtitle,JSON.stringify(s.facts),s.narrative,JSON.stringify(s.concepts),JSON.stringify(s.files_read),JSON.stringify(s.files_modified),t||null,o.toISOString(),n)}storeSummary(e,r,s,t){let o=new Date,n=o.getTime();this.db.prepare(`
SELECT id FROM sdk_sessions WHERE sdk_session_id = ?
`).get(e)||(this.db.prepare(`
INSERT INTO sdk_sessions
(claude_session_id, sdk_session_id, project, started_at, started_at_epoch, status)
VALUES (?, ?, ?, ?, ?, 'active')
`).run(e,e,r,o.toISOString(),n),console.error(`[SessionStore] Auto-created session record for session_id: ${e}`)),this.db.prepare(`
INSERT INTO session_summaries
(sdk_session_id, project, request, investigated, learned, completed,
next_steps, notes, prompt_number, created_at, created_at_epoch)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
`).run(e,r,s.request,s.investigated,s.learned,s.completed,s.next_steps,s.notes,t||null,o.toISOString(),n)}markSessionCompleted(e){let r=new Date,s=r.getTime();this.db.prepare(`
UPDATE sdk_sessions
SET status = 'completed', completed_at = ?, completed_at_epoch = ?
WHERE id = ?
`).run(r.toISOString(),s,e)}markSessionFailed(e){let r=new Date,s=r.getTime();this.db.prepare(`
UPDATE sdk_sessions
SET status = 'failed', completed_at = ?, completed_at_epoch = ?
WHERE id = ?
`).run(r.toISOString(),s,e)}cleanupOrphanedSessions(){let e=new Date,r=e.getTime();return this.db.prepare(`
UPDATE sdk_sessions
SET status = 'failed', completed_at = ?, completed_at_epoch = ?
WHERE status = 'active'
`).run(e.toISOString(),r).changes}close(){this.db.close()}};var R,y,N=null,oe="cm__claude-mem";try{R=new C,y=new L}catch(a){console.error("[search-server] Failed to initialize search:",a.message),process.exit(1)}async function A(a,e,r){if(!N)throw new Error("Chroma client not initialized");let t=(await N.callTool({name:"chroma_query_documents",arguments:{collection_name:oe,query_texts:[a],n_results:e,include:["documents","metadatas","distances"],where:r}})).content[0]?.text||"",o;try{o=JSON.parse(t)}catch(u){return console.error("[search-server] Failed to parse Chroma response as JSON:",u),{ids:[],distances:[],metadatas:[]}}let n=[],c=o.ids?.[0]||[];for(let u of c){let p=u.match(/obs_(\d+)_/);if(p){let m=parseInt(p[1],10);n.includes(m)||n.push(m)}}let d=o.distances?.[0]||[],l=o.metadatas?.[0]||[];return{ids:n,distances:d,metadatas:l}}function v(){return`
---
\u{1F4A1} Search Strategy:
ALWAYS search with index format FIRST to get an overview and identify relevant results.
This is critical for token efficiency - index format uses ~10x fewer tokens than full format.
Search workflow:
1. Initial search: Use default (index) format to see titles, dates, and sources
2. Review results: Identify which items are most relevant to your needs
3. Deep dive: Only then use format: "full" on specific items of interest
4. Narrow down: Use filters (type, dateRange, concepts, files) to refine results
Other tips:
\u2022 To search by concept: Use find_by_concept tool
\u2022 To browse by type: Use find_by_type with ["decision", "feature", etc.]
\u2022 To sort by date: Use orderBy: "date_desc" or "date_asc"`}function D(a,e){let r=a.title||`Observation #${a.id}`,s=new Date(a.created_at_epoch).toLocaleString(),t=a.type?`[${a.type}]`:"";return`${e+1}. ${t} ${r}
Date: ${s}
Source: claude-mem://observation/${a.id}`}function P(a,e){let r=a.request||`Session ${a.sdk_session_id.substring(0,8)}`,s=new Date(a.created_at_epoch).toLocaleString();return`${e+1}. ${r}
Date: ${s}
Source: claude-mem://session/${a.sdk_session_id}`}function w(a,e){let r=a.title||`Observation #${a.id}`,s=[];s.push(`## ${r}`),s.push(`*Source: claude-mem://observation/${a.id}*`),s.push(""),a.subtitle&&(s.push(`**${a.subtitle}**`),s.push("")),a.narrative&&(s.push(a.narrative),s.push("")),a.text&&(s.push(a.text),s.push(""));let t=[];if(t.push(`Type: ${a.type}`),a.facts)try{let n=JSON.parse(a.facts);n.length>0&&t.push(`Facts: ${n.join("; ")}`)}catch{}if(a.concepts)try{let n=JSON.parse(a.concepts);n.length>0&&t.push(`Concepts: ${n.join(", ")}`)}catch{}if(a.files_read||a.files_modified){let n=[];if(a.files_read)try{n.push(...JSON.parse(a.files_read))}catch{}if(a.files_modified)try{n.push(...JSON.parse(a.files_modified))}catch{}n.length>0&&t.push(`Files: ${[...new Set(n)].join(", ")}`)}t.length>0&&(s.push("---"),s.push(t.join(" | ")));let o=new Date(a.created_at_epoch).toLocaleString();return s.push(""),s.push("---"),s.push(`Date: ${o}`),s.join(`
`)}function G(a,e){let r=a.request||`Session ${a.sdk_session_id.substring(0,8)}`,s=[];s.push(`## ${r}`),s.push(`*Source: claude-mem://session/${a.sdk_session_id}*`),s.push(""),a.completed&&(s.push(`**Completed:** ${a.completed}`),s.push("")),a.learned&&(s.push(`**Learned:** ${a.learned}`),s.push("")),a.investigated&&(s.push(`**Investigated:** ${a.investigated}`),s.push("")),a.next_steps&&(s.push(`**Next Steps:** ${a.next_steps}`),s.push("")),a.notes&&(s.push(`**Notes:** ${a.notes}`),s.push(""));let t=[];if(a.files_read||a.files_edited){let n=[];if(a.files_read)try{n.push(...JSON.parse(a.files_read))}catch{}if(a.files_edited)try{n.push(...JSON.parse(a.files_edited))}catch{}n.length>0&&t.push(`Files: ${[...new Set(n)].join(", ")}`)}let o=new Date(a.created_at_epoch).toLocaleDateString();return t.push(`Date: ${o}`),t.length>0&&(s.push("---"),s.push(t.join(" | "))),s.join(`
`)}function ie(a,e){let r=a.prompt_text.length>100?a.prompt_text.substring(0,100)+"...":a.prompt_text,s=new Date(a.created_at_epoch).toLocaleString();return`${e+1}. "${r}"
Date: ${s} | Prompt #${a.prompt_number}
Source: claude-mem://user-prompt/${a.id}`}function ae(a,e){let r=[];r.push(`## User Prompt #${a.prompt_number}`),r.push(`*Source: claude-mem://user-prompt/${a.id}*`),r.push(""),r.push(a.prompt_text),r.push(""),r.push("---");let s=new Date(a.created_at_epoch).toLocaleString();return r.push(`Date: ${s}`),r.join(`
var ce=i.object({project:i.string().optional().describe("Filter by project name"),type:i.union([i.enum(["decision","bugfix","feature","refactor","discovery","change"]),i.array(i.enum(["decision","bugfix","feature","refactor","discovery","change"]))]).optional().describe("Filter by observation type"),concepts:i.union([i.string(),i.array(i.string())]).optional().describe("Filter by concept tags"),files:i.union([i.string(),i.array(i.string())]).optional().describe("Filter by file paths (partial match)"),dateRange:i.object({start:i.union([i.string(),i.number()]).optional().describe("Start date (ISO string or epoch)"),end:i.union([i.string(),i.number()]).optional().describe("End date (ISO string or epoch)")}).optional().describe("Filter by date range"),limit:i.number().min(1).max(100).default(20).describe("Maximum number of results"),offset:i.number().min(0).default(0).describe("Number of results to skip"),orderBy:i.enum(["relevance","date_desc","date_asc"]).default("relevance").describe("Sort order")}),H=[{name:"search_observations",description:'Search observations using full-text search across titles, narratives, facts, and concepts. IMPORTANT: Always use index format first (default) to get an overview with minimal token usage, then use format: "full" only for specific items of interest.',inputSchema:i.object({query:i.string().describe("Search query for FTS5 full-text search"),format:i.enum(["index","full"]).default("index").describe('Output format: "index" for titles/dates only (default, RECOMMENDED for initial search), "full" for complete details (use only after reviewing index results)'),...ce.shape}),handler:async a=>{try{let{query:e,format:r="index",...s}=a,t=[];if(N)try{console.error("[search-server] Using hybrid semantic search (Chroma + SQLite)");let n=await A(e,100);if(console.error(`[search-server] Chroma returned ${n.ids.length} semantic matches`),n.ids.length>0){let c=Math.floor(Date.now()/1e3)-7776e3,d=n.ids.filter((l,u)=>{let p=n.metadatas[u];return p&&p.created_at_epoch>c});if(console.error(`[search-server] ${d.length} results within 90-day window`),d.length>0){let l=s.limit||20;t=y.getObservationsByIds(d,{orderBy:"date_desc",limit:l}),console.error(`[search-server] Hydrated ${t.length} observations from SQLite`)}}}catch(n){console.error("[search-server] Chroma query failed, falling back to FTS5:",n.message)}if(t.length===0&&(console.error("[search-server] Using FTS5 keyword search"),t=R.searchObservations(e,s)),t.length===0)return{content:[{type:"text",text:`No observations found matching "${e}"`}]};let o;if(r==="index"){let n=`Found ${t.length} observation(s) matching "${e}":
`,c=t.map((d,l)=>D(d,l));o=n+c.join(`
`)+v()}else o=t.map((c,d)=>w(c,d)).join(`
---
`);return{content:[{type:"text",text:o}]}}catch(e){return{content:[{type:"text",text:`Search failed: ${e.message}`}],isError:!0}}}},{name:"search_sessions",description:'Search session summaries using full-text search across requests, completions, learnings, and notes. IMPORTANT: Always use index format first (default) to get an overview with minimal token usage, then use format: "full" only for specific items of interest.',inputSchema:i.object({query:i.string().describe("Search query for FTS5 full-text search"),format:i.enum(["index","full"]).default("index").describe('Output format: "index" for titles/dates only (default, RECOMMENDED for initial search), "full" for complete details (use only after reviewing index results)'),project:i.string().optional().describe("Filter by project name"),dateRange:i.object({start:i.union([i.string(),i.number()]).optional(),end:i.union([i.string(),i.number()]).optional()}).optional().describe("Filter by date range"),limit:i.number().min(1).max(100).default(20).describe("Maximum number of results"),offset:i.number().min(0).default(0).describe("Number of results to skip"),orderBy:i.enum(["relevance","date_desc","date_asc"]).default("relevance").describe("Sort order")}),handler:async a=>{try{let{query:e,format:r="index",...s}=a,t=R.searchSessions(e,s);if(t.length===0)return{content:[{type:"text",text:`No sessions found matching "${e}"`}]};let o;if(r==="index"){let n=`Found ${t.length} session(s) matching "${e}":
`,c=t.map((d,l)=>P(d,l));o=n+c.join(`
`)+v()}else o=t.map((c,d)=>G(c,d)).join(`
---
`);return{content:[{type:"text",text:o}]}}catch(e){return{content:[{type:"text",text:`Search failed: ${e.message}`}],isError:!0}}}},{name:"find_by_concept",description:'Find observations tagged with a specific concept. IMPORTANT: Always use index format first (default) to get an overview with minimal token usage, then use format: "full" only for specific items of interest.',inputSchema:i.object({concept:i.string().describe("Concept tag to search for"),format:i.enum(["index","full"]).default("index").describe('Output format: "index" for titles/dates only (default, RECOMMENDED for initial search), "full" for complete details (use only after reviewing index results)'),project:i.string().optional().describe("Filter by project name"),dateRange:i.object({start:i.union([i.string(),i.number()]).optional(),end:i.union([i.string(),i.number()]).optional()}).optional().describe("Filter by date range"),limit:i.number().min(1).max(100).default(20).describe("Maximum results. IMPORTANT: Start with 3-5 to avoid exceeding MCP token limits, even in index mode."),offset:i.number().min(0).default(0).describe("Number of results to skip"),orderBy:i.enum(["relevance","date_desc","date_asc"]).default("date_desc").describe("Sort order")}),handler:async a=>{try{let{concept:e,format:r="index",...s}=a,t=[];if(N)try{console.error("[search-server] Using metadata-first + semantic ranking for concept search");let n=R.findByConcept(e,s);if(console.error(`[search-server] Found ${n.length} observations with concept "${e}"`),n.length>0){let c=n.map(u=>u.id),d=await A(e,Math.min(c.length,100)),l=[];for(let u of d.ids)c.includes(u)&&!l.includes(u)&&l.push(u);console.error(`[search-server] Chroma ranked ${l.length} results by semantic relevance`),l.length>0&&(t=y.getObservationsByIds(l,{limit:s.limit||20}),t.sort((u,p)=>l.indexOf(u.id)-l.indexOf(p.id)))}}catch(n){console.error("[search-server] Chroma ranking failed, using SQLite order:",n.message)}if(t.length===0&&(console.error("[search-server] Using SQLite-only concept search"),t=R.findByConcept(e,s)),t.length===0)return{content:[{type:"text",text:`No observations found with concept "${e}"`}]};let o;if(r==="index"){let n=`Found ${t.length} observation(s) with concept "${e}":
`,c=t.map((d,l)=>D(d,l));o=n+c.join(`
`)+v()}else o=t.map((c,d)=>w(c,d)).join(`
---
`);return{content:[{type:"text",text:o}]}}catch(e){return{content:[{type:"text",text:`Search failed: ${e.message}`}],isError:!0}}}},{name:"find_by_file",description:'Find observations and sessions that reference a specific file path. IMPORTANT: Always use index format first (default) to get an overview with minimal token usage, then use format: "full" only for specific items of interest.',inputSchema:i.object({filePath:i.string().describe("File path to search for (supports partial matching)"),format:i.enum(["index","full"]).default("index").describe('Output format: "index" for titles/dates only (default, RECOMMENDED for initial search), "full" for complete details (use only after reviewing index results)'),project:i.string().optional().describe("Filter by project name"),dateRange:i.object({start:i.union([i.string(),i.number()]).optional(),end:i.union([i.string(),i.number()]).optional()}).optional().describe("Filter by date range"),limit:i.number().min(1).max(100).default(20).describe("Maximum results. IMPORTANT: Start with 3-5 to avoid exceeding MCP token limits, even in index mode."),offset:i.number().min(0).default(0).describe("Number of results to skip"),orderBy:i.enum(["relevance","date_desc","date_asc"]).default("date_desc").describe("Sort order")}),handler:async a=>{try{let{filePath:e,format:r="index",...s}=a,t=[],o=[];if(N)try{console.error("[search-server] Using metadata-first + semantic ranking for file search");let d=R.findByFile(e,s);if(console.error(`[search-server] Found ${d.observations.length} observations, ${d.sessions.length} sessions for file "${e}"`),o=d.sessions,d.observations.length>0){let l=d.observations.map(m=>m.id),u=await A(e,Math.min(l.length,100)),p=[];for(let m of u.ids)l.includes(m)&&!p.includes(m)&&p.push(m);console.error(`[search-server] Chroma ranked ${p.length} observations by semantic relevance`),p.length>0&&(t=y.getObservationsByIds(p,{limit:s.limit||20}),t.sort((m,_)=>p.indexOf(m.id)-p.indexOf(_.id)))}}catch(d){console.error("[search-server] Chroma ranking failed, using SQLite order:",d.message)}if(t.length===0&&o.length===0){console.error("[search-server] Using SQLite-only file search");let d=R.findByFile(e,s);t=d.observations,o=d.sessions}let n=t.length+o.length;if(n===0)return{content:[{type:"text",text:`No results found for file "${e}"`}]};let c;if(r==="index"){let d=`Found ${n} result(s) for file "${e}":
`,l=[];t.forEach((u,p)=>{l.push(D(u,p))}),o.forEach((u,p)=>{l.push(P(u,p+t.length))}),c=d+l.join(`
`)+v()}else{let d=[];t.forEach((l,u)=>{d.push(w(l,u))}),o.forEach((l,u)=>{d.push(G(l,u+t.length))}),c=d.join(`
---
`)}return{content:[{type:"text",text:c}]}}catch(e){return{content:[{type:"text",text:`Search failed: ${e.message}`}],isError:!0}}}},{name:"find_by_type",description:'Find observations of a specific type (decision, bugfix, feature, refactor, discovery, change). IMPORTANT: Always use index format first (default) to get an overview with minimal token usage, then use format: "full" only for specific items of interest.',inputSchema:i.object({type:i.union([i.enum(["decision","bugfix","feature","refactor","discovery","change"]),i.array(i.enum(["decision","bugfix","feature","refactor","discovery","change"]))]).describe("Observation type(s) to filter by"),format:i.enum(["index","full"]).default("index").describe('Output format: "index" for titles/dates only (default, RECOMMENDED for initial search), "full" for complete details (use only after reviewing index results)'),project:i.string().optional().describe("Filter by project name"),dateRange:i.object({start:i.union([i.string(),i.number()]).optional(),end:i.union([i.string(),i.number()]).optional()}).optional().describe("Filter by date range"),limit:i.number().min(1).max(100).default(20).describe("Maximum results. 
IMPORTANT: Start with 3-5 to avoid exceeding MCP token limits, even in index mode."),offset:i.number().min(0).default(0).describe("Number of results to skip"),orderBy:i.enum(["relevance","date_desc","date_asc"]).default("date_desc").describe("Sort order")}),handler:async a=>{try{let{type:e,format:r="index",...s}=a,t=Array.isArray(e)?e.join(", "):e,o=[];if(N)try{console.error("[search-server] Using metadata-first + semantic ranking for type search");let c=R.findByType(e,s);if(console.error(`[search-server] Found ${c.length} observations with type "${t}"`),c.length>0){let d=c.map(p=>p.id),l=await A(t,Math.min(d.length,100)),u=[];for(let p of l.ids)d.includes(p)&&!u.includes(p)&&u.push(p);console.error(`[search-server] Chroma ranked ${u.length} results by semantic relevance`),u.length>0&&(o=y.getObservationsByIds(u,{limit:s.limit||20}),o.sort((p,m)=>u.indexOf(p.id)-u.indexOf(m.id)))}}catch(c){console.error("[search-server] Chroma ranking failed, using SQLite order:",c.message)}if(o.length===0&&(console.error("[search-server] Using SQLite-only type search"),o=R.findByType(e,s)),o.length===0)return{content:[{type:"text",text:`No observations found with type "${t}"`}]};let n;if(r==="index"){let c=`Found ${o.length} observation(s) with type "${t}":
`,d=o.map((l,u)=>D(l,u));n=c+d.join(`
`)+v()}else n=o.map((d,l)=>w(d,l)).join(`
---
`);return{content:[{type:"text",text:n}]}}catch(e){return{content:[{type:"text",text:`Search failed: ${e.message}`}],isError:!0}}}},{name:"get_recent_context",description:"Get recent session context including summaries and observations for a project",inputSchema:i.object({project:i.string().optional().describe("Project name (defaults to current working directory basename)"),limit:i.number().min(1).max(10).default(3).describe("Number of recent sessions to retrieve")}),handler:async a=>{try{let e=a.project||ne(process.cwd()),r=a.limit||3,s=y.getRecentSessionsWithStatus(e,r);if(s.length===0)return{content:[{type:"text",text:`# Recent Session Context
No previous sessions found for project "${e}".`}]};let t=[];t.push("# Recent Session Context"),t.push(""),t.push(`Showing last ${s.length} session(s) for **${e}**:`),t.push("");for(let o of s)if(o.sdk_session_id){if(t.push("---"),t.push(""),o.has_summary){let n=y.getSummaryForSession(o.sdk_session_id);if(n){let c=n.prompt_number?` (Prompt #${n.prompt_number})`:"";if(t.push(`**Summary${c}**`),t.push(""),n.request&&t.push(`**Request:** ${n.request}`),n.completed&&t.push(`**Completed:** ${n.completed}`),n.learned&&t.push(`**Learned:** ${n.learned}`),n.next_steps&&t.push(`**Next Steps:** ${n.next_steps}`),n.files_read)try{let l=JSON.parse(n.files_read);Array.isArray(l)&&l.length>0&&t.push(`**Files Read:** ${l.join(", ")}`)}catch{n.files_read.trim()&&t.push(`**Files Read:** ${n.files_read}`)}if(n.files_edited)try{let l=JSON.parse(n.files_edited);Array.isArray(l)&&l.length>0&&t.push(`**Files Edited:** ${l.join(", ")}`)}catch{n.files_edited.trim()&&t.push(`**Files Edited:** ${n.files_edited}`)}let d=new Date(n.created_at).toLocaleString();t.push(`**Date:** ${d}`)}}else if(o.status==="active"){t.push("**In Progress**"),t.push(""),o.user_prompt&&t.push(`**Request:** ${o.user_prompt}`);let n=y.getObservationsForSession(o.sdk_session_id);if(n.length>0){t.push(""),t.push(`**Observations (${n.length}):**`);for(let d of n)t.push(`- ${d.title}`)}else t.push(""),t.push("*No observations yet*");t.push(""),t.push("**Status:** Active - summary pending");let c=new Date(o.started_at).toLocaleString();t.push(`**Date:** ${c}`)}else{t.push(`**${o.status.charAt(0).toUpperCase()+o.status.slice(1)}**`),t.push(""),o.user_prompt&&t.push(`**Request:** ${o.user_prompt}`),t.push(""),t.push(`**Status:** ${o.status} - no summary available`);let n=new Date(o.started_at).toLocaleString();t.push(`**Date:** ${n}`)}t.push("")}return{content:[{type:"text",text:t.join(`
`)}]}}catch(e){return{content:[{type:"text",text:`Failed to get recent context: ${e.message}`}],isError:!0}}}},{name:"search_user_prompts",description:'Search raw user prompts with full-text search. Use this to find what the user actually said/requested across all sessions. IMPORTANT: Always use index format first (default) to get an overview with minimal token usage, then use format: "full" only for specific items of interest.',inputSchema:i.object({query:i.string().describe("Search query for FTS5 full-text search"),format:i.enum(["index","full"]).default("index").describe('Output format: "index" for truncated prompts/dates (default, RECOMMENDED for initial search), "full" for complete prompt text (use only after reviewing index results)'),project:i.string().optional().describe("Filter by project name"),dateRange:i.object({start:i.union([i.string(),i.number()]).optional(),end:i.union([i.string(),i.number()]).optional()}).optional().describe("Filter by date range"),limit:i.number().min(1).max(100).default(20).describe("Maximum number of results"),offset:i.number().min(0).default(0).describe("Number of results to skip"),orderBy:i.enum(["relevance","date_desc","date_asc"]).default("relevance").describe("Sort order")}),handler:async a=>{try{let{query:e,format:r="index",...s}=a,t=R.searchUserPrompts(e,s);if(t.length===0)return{content:[{type:"text",text:`No user prompts found matching "${e}"`}]};let o;if(r==="index"){let n=`Found ${t.length} user prompt(s) matching "${e}":
`,c=t.map((d,l)=>ie(d,l));o=n+c.join(`
`)+v()}else o=t.map((c,d)=>ae(c,d)).join(`
---
`);return{content:[{type:"text",text:o}]}}catch(e){return{content:[{type:"text",text:`Search failed: ${e.message}`}],isError:!0}}}}],M=new Q({name:"claude-mem-search",version:"1.0.0"},{capabilities:{tools:{}}});M.setRequestHandler(te,async()=>({tools:H.map(a=>({name:a.name,description:a.description,inputSchema:re(a.inputSchema)}))}));M.setRequestHandler(se,async a=>{let e=H.find(r=>r.name===a.params.name);if(!e)throw new Error(`Unknown tool: ${a.params.name}`);try{return await e.handler(a.params.arguments||{})}catch(r){return{content:[{type:"text",text:`Tool execution failed: ${r.message}`}],isError:!0}}});async function de(){let a=new z;await M.connect(a),console.error("[search-server] Claude-mem search server started"),setTimeout(async()=>{try{console.error("[search-server] Initializing Chroma client...");let e=new ee({command:"uvx",args:["chroma-mcp","--client-type","persistent","--data-dir",B]}),r=new Z({name:"claude-mem-search-chroma-client",version:"1.0.0"},{capabilities:{}});await r.connect(e),N=r,console.error("[search-server] Chroma client connected successfully")}catch(e){console.error("[search-server] Failed to initialize Chroma client:",e.message),console.error("[search-server] Falling back to FTS5-only search"),N=null}},0)}de().catch(a=>{console.error("[search-server] Fatal error:",a),process.exit(1)});
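In the bundled handlers above, `find_by_file` and `find_by_type` follow a metadata-first pattern: SQLite narrows the candidate set, then Chroma re-ranks those candidates by semantic relevance, keeping only IDs present in both sets and preserving Chroma's order. A minimal sketch of that intersection step (the function name is hypothetical; the bundle does this inline):

```javascript
// Sketch: keep only SQLite candidates that Chroma also returned,
// ordered by Chroma's relevance ranking, with duplicates dropped.
function rankBySemanticOrder(candidateIds, chromaIds) {
  const ranked = [];
  for (const id of chromaIds) {
    if (candidateIds.includes(id) && !ranked.includes(id)) {
      ranked.push(id);
    }
  }
  return ranked;
}
```

When the result is empty (or Chroma errors), the handlers fall back to plain SQLite ordering, which is why an empty array is a meaningful return value here rather than an error.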
+5 -9
@@ -110,23 +110,19 @@ async function buildHooks() {
   await build({
     entryPoints: [SEARCH_SERVER.source],
     bundle: true,
-    platform: 'node',
-    target: 'node18',
-    format: 'esm',
-    outfile: `${hooksDir}/${SEARCH_SERVER.name}.js`,
+    platform: 'node',
+    outfile: `${hooksDir}/${SEARCH_SERVER.name}.mjs`,
     minify: true,
-    external: ['better-sqlite3'],
     define: {
       '__DEFAULT_PACKAGE_VERSION__': `"${version}"`
     },
+    packages: 'external',
     banner: {
       js: '#!/usr/bin/env node'
     }
   });
 
   // Make search server executable
-  fs.chmodSync(`${hooksDir}/${SEARCH_SERVER.name}.js`, 0o755);
-  const searchStats = fs.statSync(`${hooksDir}/${SEARCH_SERVER.name}.js`);
+  fs.chmodSync(`${hooksDir}/${SEARCH_SERVER.name}.mjs`, 0o755);
+  const searchStats = fs.statSync(`${hooksDir}/${SEARCH_SERVER.name}.mjs`);
   console.log(`✓ search-server built (${(searchStats.size / 1024).toFixed(2)} KB)`);
 
   console.log('\n✅ All hooks, worker service, and search server built successfully!');
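The build change above swaps the single `external: ['better-sqlite3']` entry for esbuild's `packages: 'external'`, which leaves every bare package import unbundled and resolved from `node_modules` at runtime, and renames the output to `.mjs` so Node loads it as an ES module. A minimal config sketch of that shape (entry and output paths here are illustrative, not the project's actual values):

```javascript
// Sketch of the resulting esbuild config. With packages: 'external',
// imports like 'better-sqlite3' or '@modelcontextprotocol/sdk' stay as
// runtime imports against node_modules instead of being inlined.
import { build } from 'esbuild';

await build({
  entryPoints: ['src/servers/search-server.ts'], // illustrative path
  bundle: true,
  platform: 'node',
  format: 'esm',                 // .mjs output is loaded as an ES module
  packages: 'external',          // replaces per-package external: [...]
  outfile: 'dist/search-server.mjs', // illustrative path
  banner: { js: '#!/usr/bin/env node' },
});
```

The trade-off: the bundle shrinks and native modules like `better-sqlite3` load correctly, but the deployed plugin directory must ship (or install) its `node_modules` dependencies.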
+46 -77
@@ -35,7 +35,6 @@ try {
 
 /**
  * Query Chroma vector database via MCP
- * Parses Python dict-like responses from Chroma MCP server
  */
 async function queryChroma(
   query: string,
@@ -59,65 +58,31 @@ async function queryChroma(
   const resultText = result.content[0]?.text || '';
 
-  // Parse Python dict-like output using regex
-  // Format: {'ids': [[...]], 'distances': [[...]], 'metadatas': [[...]]}
+  // Parse JSON response
+  let parsed: any;
+  try {
+    parsed = JSON.parse(resultText);
+  } catch (error) {
+    console.error('[search-server] Failed to parse Chroma response as JSON:', error);
+    return { ids: [], distances: [], metadatas: [] };
+  }
 
-  // Extract IDs (nested array format)
-  const idsMatch = resultText.match(/'ids':\s*\[\[(.*?)\]\]/s);
+  // Extract unique observation IDs from document IDs
   const ids: number[] = [];
-  if (idsMatch) {
-    const idsContent = idsMatch[1];
-    // Match quoted strings (Chroma doc IDs like 'obs_123_title')
-    const idMatches = idsContent.match(/'([^']*(?:\\'[^']*)*)'/g) || [];
-    for (const idMatch of idMatches) {
-      const docId = idMatch.slice(1, -1);
-      // Extract sqlite_id from document ID (format: obs_{id}_title)
-      const sqliteIdMatch = docId.match(/obs_(\d+)_/);
-      if (sqliteIdMatch) {
-        const sqliteId = parseInt(sqliteIdMatch[1], 10);
-        if (!ids.includes(sqliteId)) {
-          ids.push(sqliteId);
-        }
-      }
+  const docIds = parsed.ids?.[0] || [];
+  for (const docId of docIds) {
+    // Extract sqlite_id from document ID (format: obs_{id}_narrative, obs_{id}_fact_0, etc)
+    const match = docId.match(/obs_(\d+)_/);
+    if (match) {
+      const sqliteId = parseInt(match[1], 10);
+      if (!ids.includes(sqliteId)) {
+        ids.push(sqliteId);
+      }
     }
   }
 
-  // Extract distances (nested array format)
-  const distancesMatch = resultText.match(/'distances':\s*\[\[([\d.,\s]+)\]\]/s);
-  const distances: number[] = [];
-  if (distancesMatch) {
-    const distancesContent = distancesMatch[1];
-    const distanceValues = distancesContent.split(',').map(d => parseFloat(d.trim())).filter(d => !isNaN(d));
-    distances.push(...distanceValues);
-  }
-
-  // Extract metadatas (nested array format)
-  const metasMatch = resultText.match(/'metadatas':\s*\[\[(.*?)\]\]/s);
-  const metadatas: any[] = [];
-  if (metasMatch) {
-    const metasContent = metasMatch[1];
-    // Parse each metadata dict
-    const metaObjMatches = metasContent.match(/\{[^}]+\}/g) || [];
-    for (const metaStr of metaObjMatches) {
-      const meta: any = {};
-      // Extract sqlite_id
-      const sqliteIdMatch = metaStr.match(/'sqlite_id':\s*(\d+)/);
-      if (sqliteIdMatch) {
-        meta.sqlite_id = parseInt(sqliteIdMatch[1], 10);
-      }
-      // Extract type
-      const typeMatch = metaStr.match(/'type':\s*'([^']+)'/);
-      if (typeMatch) {
-        meta.type = typeMatch[1];
-      }
-      // Extract created_at_epoch
-      const epochMatch = metaStr.match(/'created_at_epoch':\s*(\d+)/);
-      if (epochMatch) {
-        meta.created_at_epoch = parseInt(epochMatch[1], 10);
-      }
-      metadatas.push(meta);
-    }
-  }
+  const distances = parsed.distances?.[0] || [];
+  const metadatas = parsed.metadatas?.[0] || [];
 
   return { ids, distances, metadatas };
 }
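The rewritten parser assumes chroma-mcp now emits JSON rather than a Python-style repr, which collapses roughly eighty lines of regex scraping into a few array reads. The ID-mapping step is self-contained and can be exercised on its own; a sketch mirroring the new logic (the standalone function name is hypothetical):

```javascript
// Sketch: map Chroma document IDs (format obs_{sqliteId}_{suffix}) back
// to unique SQLite observation IDs, preserving Chroma's relevance order.
function extractSqliteIds(parsed) {
  const ids = [];
  const docIds = parsed.ids?.[0] || []; // Chroma nests results one level deep
  for (const docId of docIds) {
    const match = docId.match(/obs_(\d+)_/);
    if (match) {
      const id = parseInt(match[1], 10);
      if (!ids.includes(id)) ids.push(id);
    }
  }
  return ids;
}
```

Order matters here: Chroma returns documents by ascending distance, so keeping the first occurrence of each `sqlite_id` preserves its best rank even when one observation produced several vector documents.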
@@ -1095,32 +1060,36 @@ server.setRequestHandler(CallToolRequestSchema, async (request) => {
 
 // Start the server
 async function main() {
-  // Initialize Chroma client
-  try {
-    console.error('[search-server] Initializing Chroma client...');
-    const chromaTransport = new StdioClientTransport({
-      command: 'uvx',
-      args: ['chroma-mcp', '--client-type', 'persistent', '--data-dir', VECTOR_DB_DIR]
-    });
-    chromaClient = new Client({
-      name: 'claude-mem-search-chroma-client',
-      version: '1.0.0'
-    }, {
-      capabilities: {}
-    });
-    await chromaClient.connect(chromaTransport);
-    console.error('[search-server] Chroma client connected successfully');
-  } catch (error: any) {
-    console.error('[search-server] Failed to initialize Chroma client:', error.message);
-    console.error('[search-server] Falling back to FTS5-only search');
-    chromaClient = null;
-  }
-
+  // Start the MCP server FIRST (critical - must start before blocking operations)
   const transport = new StdioServerTransport();
   await server.connect(transport);
   console.error('[search-server] Claude-mem search server started');
+
+  // Initialize Chroma client in background (non-blocking)
+  setTimeout(async () => {
+    try {
+      console.error('[search-server] Initializing Chroma client...');
+      const chromaTransport = new StdioClientTransport({
+        command: 'uvx',
+        args: ['chroma-mcp', '--client-type', 'persistent', '--data-dir', VECTOR_DB_DIR]
+      });
+      const client = new Client({
+        name: 'claude-mem-search-chroma-client',
+        version: '1.0.0'
+      }, {
+        capabilities: {}
+      });
+      await client.connect(chromaTransport);
+      chromaClient = client;
+      console.error('[search-server] Chroma client connected successfully');
+    } catch (error: any) {
+      console.error('[search-server] Failed to initialize Chroma client:', error.message);
+      console.error('[search-server] Falling back to FTS5-only search');
+      chromaClient = null;
+    }
+  }, 0);
 }
 
 main().catch((error) => {
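The reordered `main()` connects the stdio transport before anything else, then defers the Chroma connection to a `setTimeout(..., 0)` macrotask, so a slow `uvx chroma-mcp` spawn can never delay MCP startup; tool handlers null-check `chromaClient` and fall back to FTS5 until it is set. The ordering can be sketched generically (all names here are hypothetical stand-ins, not the server's actual API):

```javascript
// Sketch: start the primary service first, then initialize an optional
// dependency in a macrotask so a slow or failing init never blocks startup.
let optionalClient = null;

async function start(connectServer, connectOptional) {
  await connectServer();           // server is accepting requests from here on
  setTimeout(async () => {
    try {
      optionalClient = await connectOptional();
    } catch {
      optionalClient = null;       // degrade gracefully (FTS5-only mode)
    }
  }, 0);
}
```

Note the new code assigns `chromaClient` only after `connect()` resolves, instead of the old pattern of assigning first and connecting second — handlers can never observe a half-initialized client.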