Implement hybrid search: Chroma semantic + SQLite temporal

Core implementation:
- Added Chroma MCP client integration to search-server.ts
- Implemented queryChroma() helper with Python dict parsing
- Added VECTOR_DB_DIR constant to paths.ts
- Added SessionStore.getObservationsByIds() method

Search handlers updated:
- search_observations: semantic-first with 90-day temporal filter
- find_by_concept/type/file: metadata-first, semantic-enhanced ranking
- All handlers fall back to FTS5 if Chroma is unavailable

Technical details:
- Direct MCP client usage (no abstractions)
- Regex parsing of Chroma Python dict responses
- Semantic ranking preserved in final results
- Graceful degradation to FTS5-only search

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
# Feature Implementation Plan: Hybrid Search (Chroma + SQLite)

## Status: Experimental validation complete, ready for production implementation

## Experiment Results Summary

**Branch:** `experiment/chroma-mcp`
**Validation:** Semantic search (Chroma) + temporal filtering (SQLite) working correctly
**Collection:** `cm__claude-mem` with 2,800+ documents synced
**Decision:** Proceed with production implementation

---

## Implementation Plan
### Phase 1: Clean Start

#### 1.1 Create Feature Branch

```bash
# Start from a clean main branch
git checkout main
git pull origin main

# Create the new feature branch
git branch feature/hybrid-search
git checkout feature/hybrid-search
```

#### 1.2 Port Working Experiment Scripts

**Files to keep (these work correctly):**
- `experiment/chroma-sync-experiment.ts` - Syncs SQLite → Chroma
- `experiment/chroma-search-test.ts` - Validates search quality
- `experiment/README.md` - Experiment documentation
- `experiment/RESULTS.md` - Update with accurate current results

**Actions:**
```bash
# Cherry-pick only the experiment files from experiment/chroma-mcp
git checkout experiment/chroma-mcp -- experiment/

# Remove any experiment artifacts that reference the old implementation
# (test-chroma-connection.ts uses the broken ChromaOrchestrator)
git rm test-chroma-connection.ts 2>/dev/null || true

# Commit the clean experiment baseline
git commit -m "Add validated Chroma search experiments"
```

---
### Phase 2: Production Architecture

#### 2.1 Design Principles

**Core Rules:**
1. ✅ Direct MCP client usage (no wrapper abstractions)
2. ✅ Inline helper functions (no ChromaOrchestrator)
3. ✅ Each search workflow is deterministic (a fixed pipeline per tool; FTS5 is used only as a failure fallback)
4. ✅ Temporal boundaries prevent stale results
5. ✅ Chroma handles semantic ranking, SQLite handles recency

**File Structure:**
```
src/
├── servers/
│   └── search-server.ts        # Hybrid MCP server (SQLite + Chroma)
├── services/
│   ├── sqlite/
│   │   ├── SessionStore.ts     # SQLite CRUD (unchanged)
│   │   └── SessionSearch.ts    # FTS5 search (fallback if Chroma fails)
│   └── sync/
│       └── ChromaSync.ts       # NEW: Sync SQLite → Chroma on observation save
└── shared/
    └── paths.ts                # Add VECTOR_DB_DIR constant
```
#### 2.2 Search Workflows

**Workflow 1: search_observations (Semantic-First, Temporally-Bounded)**
```
User Query → Chroma Semantic Search (top 100)
           → Filter: created_at_epoch > (now - 90 days)
           → SQLite: Hydrate full records
           → Sort: created_at_epoch DESC
           → Return: Recent + semantically relevant
```
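This workflow reduces to a small orchestration function. A hedged sketch, not the shipped handler: `queryChroma` and `hydrate` are injected stand-ins for the Chroma call and the SQLite hydration step, and `createdAtEpoch` is assumed to be in milliseconds (adjust the cutoff arithmetic if the schema stores seconds).

```typescript
// Sketch of Workflow 1 as one orchestration function (hypothetical names;
// the real handler lives in search-server.ts). The Chroma and SQLite layers
// are injected so the pipeline itself stays testable.
type ChromaHit = { id: number; distance: number; createdAtEpoch: number };
type ObservationRow = { id: number; createdAtEpoch: number };

const NINETY_DAYS_MS = 90 * 24 * 60 * 60 * 1000;

async function searchObservations(
  query: string,
  queryChroma: (q: string, limit: number) => Promise<ChromaHit[]>,
  hydrate: (ids: number[]) => Promise<ObservationRow[]>,
  now: number = Date.now()
): Promise<ObservationRow[]> {
  // 1. Semantic search: over-fetch (top 100) so the temporal filter has room to trim
  const hits = await queryChroma(query, 100);

  // 2. Temporal boundary: drop anything older than 90 days
  const cutoff = now - NINETY_DAYS_MS;
  const recent = hits.filter((h) => h.createdAtEpoch > cutoff);

  // 3. Hydrate full records from SQLite, then sort newest-first
  const rows = await hydrate(recent.map((h) => h.id));
  return rows.sort((a, b) => b.createdAtEpoch - a.createdAtEpoch);
}
```

Keeping the filter and sort in one place makes the "recent + semantically relevant" contract easy to verify in isolation.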

**Workflow 2: find_by_concept/type/file (Metadata-First, Semantic-Enhanced)**
```
User Query → SQLite: Filter by metadata (type/concept/file)
           → Chroma: Rank filtered IDs by semantic relevance
           → SQLite: Hydrate in semantic rank order
           → Return: Metadata-filtered + semantically ranked
```
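The re-ranking step in this workflow is a small pure function. A sketch under assumed names (the real handler works with full rows rather than bare IDs):

```typescript
// Given IDs that passed the SQLite metadata filter and the semantic order
// returned by Chroma, keep only the filtered IDs, in semantic rank order.
// IDs that Chroma never returned (no embedding hit) still belong in the
// result set, so they are appended at the end.
function semanticRank(filteredIds: number[], semanticOrder: number[]): number[] {
  const allowed = new Set(filteredIds);
  const ranked = semanticOrder.filter((id) => allowed.has(id));
  const seen = new Set(ranked);
  const leftovers = filteredIds.filter((id) => !seen.has(id));
  return [...ranked, ...leftovers];
}
```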

**Workflow 3: search_sessions (SQLite FTS5 Only)**
```
User Query → SQLite FTS5 search (sessions are already summarized)
           → Return: Keyword matches
```

**Workflow 4: get_recent_context (Temporal-First, No Semantic)**
```
Hook Request → SQLite: Last 50 observations ORDER BY created_at_epoch DESC
             → Return: Most recent context (no semantic ranking needed)
```

---
### Phase 3: Implementation Steps

#### 3.1 Add Chroma Support to search-server.ts

**File:** `src/servers/search-server.ts`

**Changes:**

1. Add Chroma MCP client initialization (lines 20-26):
```typescript
let chromaClient: Client;
const COLLECTION_NAME = 'cm__claude-mem';
```
2. Add a `queryChroma()` helper function with proper Python dict parsing:
```typescript
async function queryChroma(
  query: string,
  limit: number,
  whereFilter?: Record<string, any>
): Promise<{ ids: number[]; distances: number[]; metadatas: any[] }>
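Because chroma-mcp is a Python server, tool results come back as stringified Python dicts (single quotes, `None`) rather than JSON, which is why the plan calls for regex extraction instead of `JSON.parse`. A minimal sketch of the ID-extraction part, assuming the `obs_<id>_<chunk>` document-ID scheme described in Phase 3.3:

```typescript
// Pull SQLite observation IDs out of a stringified Python dict response.
// Document IDs look like 'obs_42_title', 'obs_42_fact_0', 'obs_7_narrative'.
function parseChromaIds(pythonDict: string): number[] {
  const ids: number[] = [];
  const re = /'obs_(\d+)_[a-z0-9_]+'/g;
  let m: RegExpExecArray | null;
  while ((m = re.exec(pythonDict)) !== null) {
    ids.push(Number(m[1]));
  }
  // De-duplicate: one observation can match on title, narrative, and facts
  return [...new Set(ids)];
}
```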

3. Initialize the Chroma client in `main()`:
```typescript
const chromaTransport = new StdioClientTransport({
  command: 'uvx',
  args: ['chroma-mcp', '--client-type', 'persistent', '--data-dir', VECTOR_DB_DIR]
});
chromaClient = new Client({...});
await chromaClient.connect(chromaTransport);
```
4. Update the `search_observations` handler (lines 350-427):
   - Replace FTS5 search with Chroma semantic search
   - Add the 90-day temporal filter
   - Hydrate from SQLite in temporal order

5. Update the `find_by_concept` handler (lines 501-575):
   - SQLite metadata filter first
   - Chroma semantic ranking second
   - Preserve semantic rank order in the final results

6. Update the `find_by_type` handler (lines 720-797):
   - Same pattern as find_by_concept

7. Update the `find_by_file` handler (lines 592-700):
   - Same pattern as find_by_concept

**IMPORTANT:**
- Keep `SessionSearch` as a fallback (if the Chroma client fails to connect)
- Add error handling: if a Chroma query fails, fall back to FTS5
- Log all Chroma operations to stderr for debugging
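The fallback rule can be centralized in one wrapper so every handler degrades the same way. Illustrative names, not the shipped code; logging goes to stderr because stdout carries the MCP protocol:

```typescript
// Try Chroma first; on any failure, log to stderr and run the FTS5 path.
async function withFts5Fallback<T>(
  chromaSearch: () => Promise<T>,
  fts5Search: () => T,
  log: (msg: string) => void = (m) => console.error(m)
): Promise<T> {
  try {
    return await chromaSearch();
  } catch (err) {
    log(`[search-server] Chroma query failed, falling back to FTS5: ${err}`);
    return fts5Search();
  }
}
```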

#### 3.2 Add VECTOR_DB_DIR Path Constant

**File:** `src/shared/paths.ts`

```typescript
export const VECTOR_DB_DIR = path.join(DATA_DIR, 'vector-db');
```

#### 3.3 Add Automatic Sync Service

**NEW File:** `src/services/sync/ChromaSync.ts`

**Purpose:** Automatically sync new observations to Chroma when the worker saves them

**Key Methods:**
```typescript
class ChromaSync {
  async syncObservation(obs: Observation): Promise<void>
  async syncBatch(observations: Observation[]): Promise<void>
  async ensureCollection(): Promise<void>
}
```

**Integration Points:**
- `worker-service.ts` - After saving an observation to SQLite, call `chromaSync.syncObservation()`
- Batch sync on startup: sync any observations not yet in Chroma
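The startup catch-up pass is set arithmetic over IDs. A hypothetical helper, with the Chroma and SQLite calls injected:

```typescript
// Backfill: sync any observations whose IDs exist in SQLite but not in Chroma.
// Returns how many were synced, which is useful for startup logging.
async function backfillMissing(
  sqliteIds: number[],
  chromaIds: Set<number>,
  syncBatch: (ids: number[]) => Promise<void>
): Promise<number> {
  const missing = sqliteIds.filter((id) => !chromaIds.has(id));
  if (missing.length > 0) {
    await syncBatch(missing);
  }
  return missing.length;
}
```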

**Document Format (per experiment):**
```typescript
// Each observation creates multiple Chroma documents (one per semantic chunk)
id: `obs_${obs.id}_title`
document: obs.title
metadata: { sqlite_id: obs.id, type: obs.type, created_at_epoch: obs.created_at_epoch }

id: `obs_${obs.id}_narrative`
document: obs.narrative
metadata: { sqlite_id: obs.id, type: obs.type, created_at_epoch: obs.created_at_epoch }

// Facts become individual searchable chunks
id: `obs_${obs.id}_fact_${i}`
document: fact
metadata: { sqlite_id: obs.id, type: obs.type, created_at_epoch: obs.created_at_epoch }
```
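That fan-out is a pure mapping from one observation to several Chroma documents. A sketch with an assumed `Observation` shape (the real type lives in the project's source):

```typescript
type Observation = {
  id: number;
  type: string;
  title: string;
  narrative: string;
  facts: string[];
  created_at_epoch: number;
};

type ChromaDoc = { id: string; document: string; metadata: Record<string, unknown> };

// One document each for title and narrative, plus one per fact,
// all carrying sqlite_id so results can be hydrated from SQLite.
function observationToDocs(obs: Observation): ChromaDoc[] {
  const metadata = {
    sqlite_id: obs.id,
    type: obs.type,
    created_at_epoch: obs.created_at_epoch,
  };
  return [
    { id: `obs_${obs.id}_title`, document: obs.title, metadata },
    { id: `obs_${obs.id}_narrative`, document: obs.narrative, metadata },
    ...obs.facts.map((fact, i) => ({
      id: `obs_${obs.id}_fact_${i}`,
      document: fact,
      metadata,
    })),
  ];
}
```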

---

### Phase 4: Build and Validation

#### 4.1 Build Process
```bash
# Build all scripts
npm run build

# Verify outputs
ls -lh plugin/scripts/search-server.js   # Should exist (ESM)
ls -lh plugin/scripts/search-server.cjs  # Should NOT exist (delete if present)

# Check build format
head -1 plugin/scripts/search-server.js  # Should show: #!/usr/bin/env node
```
#### 4.2 Validation Checklist

**✅ Pre-deployment checks:**

1. Run the sync experiment: `npx tsx experiment/chroma-sync-experiment.ts`
   - Verify the collection is created
   - Verify documents are synced
   - Check that the document count matches the observations

2. Run the search test: `npx tsx experiment/chroma-search-test.ts`
   - Verify semantic queries return results
   - Compare quality vs FTS5
   - Document results in RESULTS.md

3. Test the MCP server standalone:
   ```bash
   # Start the server manually
   node plugin/scripts/search-server.js

   # In another terminal, test with the MCP inspector
   npx @modelcontextprotocol/inspector node plugin/scripts/search-server.js
   ```

4. Test with Claude Code:
   ```bash
   # Deploy to the plugin directory
   cp -r plugin/* ~/.claude/plugins/marketplaces/thedotmack/

   # Restart the worker
   pm2 restart claude-mem-worker

   # Start a new Claude session and test the search tools
   ```

**✅ Smoke tests:**
- Search for recent work: should return the last 90 days
- Search for old concepts: should filter by recency
- Search by file: should return file-specific observations
- Search by type: should return only that type
---

### Phase 5: Documentation

#### 5.1 Update CLAUDE.md

Add to the "What It Does" section:
```markdown
### Hybrid Search Architecture

Claude-mem uses a hybrid search system combining:
- **Semantic Search (Chroma)**: Vector embeddings for conceptual understanding
- **Keyword Search (SQLite FTS5)**: Full-text search for exact matches
- **Temporal Filtering**: 90-day recency boundary prevents stale results

Search workflows automatically choose the optimal combination:
- Conceptual queries → Semantic-first, temporally-bounded
- Metadata queries → Metadata-first, semantically-enhanced
- Recent context → Temporal-first (no semantic ranking)
```
#### 5.2 Update Architecture Section

```markdown
### Vector Database Layer

**Technology**: ChromaDB via the Chroma MCP server
**Location**: `~/.claude-mem/vector-db/`
**Collection**: `cm__claude-mem`

**Sync Strategy**:
- Worker service syncs observations to Chroma after the SQLite save
- Each observation creates multiple vector documents (title, narrative, facts)
- Metadata includes `sqlite_id` for cross-reference

**Search Strategy**:
- Semantic queries use Chroma with a 90-day temporal filter
- Metadata queries filter SQLite first, then rank semantically
- Fallback to FTS5 if Chroma is unavailable
```
#### 5.3 Write Release Notes

**File:** `EXPERIMENTAL_RELEASE_NOTES.md`

```markdown
# Hybrid Search Release (v4.4.0)

## Breaking Changes
None - search MCP tools keep the same interface

## New Features

### Semantic Search via Chroma
- Added ChromaDB integration for vector-based semantic search
- Observations automatically synced to the vector database
- Search understands conceptual queries (not just keywords)

### Hybrid Search Workflows
- `search_observations`: Semantic search with 90-day recency filter
- `find_by_concept/type/file`: Metadata filtering + semantic ranking
- Automatic fallback to FTS5 if Chroma is unavailable

### Sync Automation
- Worker service auto-syncs new observations to Chroma
- Batch sync on startup for any missing observations
- Collection: `cm__claude-mem` in `~/.claude-mem/vector-db/`

## Technical Details

**New Dependencies:**
- `@modelcontextprotocol/sdk` (already present)
- External: `chroma-mcp` (Python package, run via uvx)

**New Files:**
- `src/services/sync/ChromaSync.ts` - Auto-sync service
- `experiment/chroma-sync-experiment.ts` - Manual sync tool
- `experiment/chroma-search-test.ts` - Search quality validator

**Modified Files:**
- `src/servers/search-server.ts` - Hybrid search implementation
- `src/services/worker-service.ts` - Auto-sync integration
- `src/shared/paths.ts` - Added VECTOR_DB_DIR constant

**Design Rationale:**
- Temporal boundaries prevent old, semantically perfect matches from outranking recent updates
- Metadata-first filtering eliminates irrelevant categories before semantic ranking
- Direct MCP client usage avoids abstraction overhead
- Inline helpers keep parsing logic close to its usage
```
---

### Phase 6: Deployment

#### 6.1 Pre-merge Validation
```bash
# Ensure all tests pass
npm run build
npm run test:parser  # If applicable

# Validate experiment results
npx tsx experiment/chroma-sync-experiment.ts
npx tsx experiment/chroma-search-test.ts

# Test the production MCP server
node plugin/scripts/search-server.js &
# Send test queries via the MCP inspector

# Clean build artifacts
rm -f plugin/scripts/*.cjs  # Remove stale CommonJS builds
```
#### 6.2 Commit Strategy
```bash
# Commit 1: Experiment scripts (already done if following the plan)
git add experiment/
git commit -m "Add validated Chroma search experiments"

# Commit 2: Core implementation
git add src/servers/search-server.ts src/shared/paths.ts
git commit -m "Implement hybrid search: Chroma semantic + SQLite temporal"

# Commit 3: Auto-sync service
git add src/services/sync/ src/services/worker-service.ts
git commit -m "Add automatic observation sync to Chroma vector DB"

# Commit 4: Documentation
git add CLAUDE.md EXPERIMENTAL_RELEASE_NOTES.md
git commit -m "Document hybrid search architecture and usage"

# Commit 5: Build artifacts
npm run build
git add plugin/scripts/
git commit -m "Build hybrid search implementation"
```
#### 6.3 Merge to Main
```bash
# Push the feature branch
git push origin feature/hybrid-search

# Create a PR or merge directly (your choice)
git checkout main
git merge feature/hybrid-search
git push origin main

# Tag the release
git tag v4.4.0
git push origin v4.4.0
```
---

## Rollback Plan

If issues arise post-deployment:

```bash
# Quick rollback
git checkout main
git revert HEAD~5..HEAD  # Revert the last 5 commits
git push origin main

# Or cherry-pick the revert
git checkout -b hotfix/rollback-hybrid-search
git revert <commit-sha>
git push origin hotfix/rollback-hybrid-search
```

**Chroma data cleanup (if needed):**
```bash
# Remove the vector database
rm -rf ~/.claude-mem/vector-db/

# The search server will fall back to FTS5 if Chroma is unavailable
```
---

## Success Criteria

**Must have before merge:**
- ✅ Sync experiment completes without errors
- ✅ Search test shows Chroma returning results
- ✅ MCP server starts and responds to queries
- ✅ Fallback to FTS5 works if Chroma is unavailable
- ✅ No breaking changes to existing MCP tool interfaces
- ✅ Documentation updated
- ✅ No uncommitted changes
- ✅ No dead code (ChromaOrchestrator removed)
- ✅ No stale build artifacts (.cjs files)

**Nice to have:**
- Performance benchmarks (Chroma vs FTS5 query time)
- Search quality metrics (relevance scores)
- Token usage comparison (semantic vs keyword results)
---

## Timeline Estimate

- Phase 1 (Clean Start): 15 minutes
- Phase 2 (Architecture Review): 30 minutes
- Phase 3 (Implementation): 2-3 hours
- Phase 4 (Validation): 1 hour
- Phase 5 (Documentation): 1 hour
- Phase 6 (Deployment): 30 minutes

**Total: ~5-6 hours** for a complete, validated implementation
---

## Notes

- The experiment validated that semantic search works and provides value
- This plan avoids the mistakes from the previous attempt:
  - ✅ Clean branch from main (no baggage)
  - ✅ Implementation AFTER experiment validation
  - ✅ No dead code (no ChromaOrchestrator)
  - ✅ Proper commit strategy
  - ✅ Complete documentation
  - ✅ Validation at every step

---
# Prompt for Next Session: Hybrid Search Implementation

Copy this entire prompt into a new Claude Code session to continue the hybrid search feature implementation.

---

## Context

I'm working on the `claude-mem` project (a persistent memory system for Claude Code). I have an experimental branch `experiment/chroma-mcp` that attempted to add semantic search via ChromaDB, but it has implementation issues and was done in the wrong order.

**Current Status:**
- ✅ Experiment validated: semantic search (Chroma) + temporal filtering (SQLite) works
- ✅ Chroma collection `cm__claude-mem` has 2,800+ documents synced
- ✅ Search quality tests show semantic search provides value
- ❌ Production implementation has issues (dead code, uncommitted fixes, wrong process)
- ✅ Feature plan written and ready to execute

**Your Task:**
Follow the feature implementation plan in `FEATURE_PLAN_HYBRID_SEARCH.md` to implement hybrid search correctly from the ground up.

---
## Immediate Actions

1. **Read the feature plan:**
   ```
   Read: /Users/alexnewman/Scripts/claude-mem/FEATURE_PLAN_HYBRID_SEARCH.md
   ```

2. **Understand the experiment results:**
   - The experiment scripts work correctly
   - Chroma semantic search is functional
   - We just need to implement it properly in production

3. **Execute Phase 1 of the plan:**
   - Create a new `feature/hybrid-search` branch from `main`
   - Port the working experiment scripts from `experiment/chroma-mcp`
   - Clean up any dead-code references

---

## Key Principles for This Implementation

1. **Start clean:** New branch from `main`, no baggage from the failed attempt
2. **No abstractions:** Direct MCP client usage, no ChromaOrchestrator wrapper
3. **Validate at each step:** Don't commit until you've tested that it works
4. **Proper parsing:** Chroma MCP returns Python dicts, not JSON - use regex parsing
5. **Temporal boundaries:** The 90-day filter prevents stale semantic matches
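Principle 5 is simple epoch arithmetic. A sketch, assuming `created_at_epoch` is stored in milliseconds (if the schema stores seconds, scale accordingly):

```typescript
// 90-day recency boundary for semantic results.
const DAY_MS = 24 * 60 * 60 * 1000;

function recencyCutoff(days: number, now: number = Date.now()): number {
  return now - days * DAY_MS;
}

function isRecent(createdAtEpoch: number, cutoff: number): boolean {
  return createdAtEpoch > cutoff;
}
```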

---

## Files You'll Need to Work With

**Core Implementation:**
- `src/servers/search-server.ts` - Add hybrid search workflows
- `src/services/sync/ChromaSync.ts` - NEW: Auto-sync observations to Chroma
- `src/services/worker-service.ts` - Integrate auto-sync
- `src/shared/paths.ts` - Add VECTOR_DB_DIR constant

**Experiment Files (keep these, they work):**
- `experiment/chroma-sync-experiment.ts` - Manual sync tool
- `experiment/chroma-search-test.ts` - Search quality validator

**Files to DELETE (dead code from the failed attempt):**
- `src/services/chroma/ChromaOrchestrator.ts` - Broken wrapper, never used
- `test-chroma-connection.ts` - Uses the broken ChromaOrchestrator
- `plugin/scripts/search-server.cjs` - Stale CommonJS build

---
## Validation Checklist

Before committing any code, verify:

```bash
# 1. Build succeeds
npm run build

# 2. Sync works
npx tsx experiment/chroma-sync-experiment.ts

# 3. Search works
npx tsx experiment/chroma-search-test.ts

# 4. MCP server starts
node plugin/scripts/search-server.js
# (Ctrl+C to stop)

# 5. No dead code
grep -r "ChromaOrchestrator" src/  # Should return nothing

# 6. No stale builds
ls plugin/scripts/search-server.cjs  # Should not exist

# 7. Git status clean
git status  # No uncommitted changes to production files
```

---
## Implementation Workflow (from Phase 3 of the plan)

### Step 1: Add the queryChroma Helper
In `src/servers/search-server.ts`, add a helper function that:
- Takes: `query: string, limit: number, whereFilter?: object`
- Calls: `chromaClient.callTool({ name: 'chroma_query_documents', ... })`
- Parses: the Python dict response with regex (see lines 256-318 in the current branch for an example)
- Returns: `{ ids: number[], distances: number[], metadatas: any[] }`

### Step 2: Initialize the Chroma Client
In the `main()` function:
```typescript
const chromaTransport = new StdioClientTransport({
  command: 'uvx',
  args: ['chroma-mcp', '--client-type', 'persistent', '--data-dir', VECTOR_DB_DIR]
});
chromaClient = new Client({ name: 'claude-mem-search-chroma-client', version: '1.0.0' }, { capabilities: {} });
await chromaClient.connect(chromaTransport);
```
### Step 3: Update the search_observations Handler
Replace the FTS5 keyword search with:
1. Chroma semantic search (top 100)
2. Filter by recency (90 days)
3. Hydrate from SQLite in temporal order
4. Return results

### Step 4: Update the Metadata Search Handlers
For `find_by_concept`, `find_by_type`, and `find_by_file`:
1. SQLite metadata filter first
2. Chroma semantic ranking second
3. Preserve semantic rank order in the results

---

## Expected Timeline

- Phase 1 (Clean Start): 15 minutes
- Phase 2 (Architecture Review): Already done - read the plan
- Phase 3 (Implementation): 2-3 hours
- Phase 4 (Validation): 1 hour
- Phase 5 (Documentation): 1 hour
- Phase 6 (Deployment): 30 minutes

**Total: ~5-6 hours**

---
## Questions to Ask Me

If you encounter any issues:

1. "The Chroma MCP client isn't connecting" → Check whether `uvx chroma-mcp` is available
2. "Parsing errors from Chroma responses" → Show me the response format and I'll help fix the regex
3. "Not sure about the search workflow logic" → Reference Phase 2.2 of the plan
4. "Should I commit now?" → Only if the validation checklist passes
5. "Merge to main or PR?" → I'll decide; just get to Phase 6 first

---
## Success Criteria

Don't merge until ALL of these are true:

- ✅ Sync experiment completes without errors
- ✅ Search test shows Chroma returning relevant results
- ✅ MCP server starts and responds to queries
- ✅ Fallback to FTS5 works if Chroma is unavailable
- ✅ No breaking changes to MCP tool interfaces
- ✅ Documentation updated (CLAUDE.md + release notes)
- ✅ No uncommitted changes in git status
- ✅ No dead code (ChromaOrchestrator removed)
- ✅ No stale build artifacts (.cjs files deleted)

---
## Start Here

```
1. Read the feature plan:
   Read: /Users/alexnewman/Scripts/claude-mem/FEATURE_PLAN_HYBRID_SEARCH.md

2. Create the feature branch:
   Bash: git checkout main && git pull && git checkout -b feature/hybrid-search

3. Begin Phase 1 of the plan (porting experiment scripts)

4. Work through each phase systematically, validating at each step

5. Ask me questions if anything is unclear
```

Let's build this correctly, from the ground up. Take your time and validate at each step.
@@ -1,7 +1,7 @@
#!/usr/bin/env node
import{stdin as N}from"process";import F from"better-sqlite3";import{join as p,dirname as x,basename as Y}from"path";import{homedir as h}from"os";import{existsSync as Q,mkdirSync as U}from"fs";import{fileURLToPath as w}from"url";function X(){return typeof __dirname<"u"?__dirname:x(w(import.meta.url))}var M=X(),c=process.env.CLAUDE_MEM_DATA_DIR||p(h(),".claude-mem"),u=process.env.CLAUDE_CONFIG_DIR||p(h(),".claude"),Z=p(c,"archives"),ee=p(c,"logs"),se=p(c,"trash"),te=p(c,"backups"),re=p(c,"settings.json"),I=p(c,"claude-mem.db"),ne=p(u,"settings.json"),oe=p(u,"commands"),ie=p(u,"CLAUDE.md");function O(o){U(o,{recursive:!0})}function L(){return p(M,"..","..")}var l=(n=>(n[n.DEBUG=0]="DEBUG",n[n.INFO=1]="INFO",n[n.WARN=2]="WARN",n[n.ERROR=3]="ERROR",n[n.SILENT=4]="SILENT",n))(l||{}),T=class{level;useColor;constructor(){let e=process.env.CLAUDE_MEM_LOG_LEVEL?.toUpperCase()||"INFO";this.level=l[e]??1,this.useColor=process.stdout.isTTY??!1}correlationId(e,s){return`obs-${e}-${s}`}sessionId(e){return`session-${e}`}formatData(e){if(e==null)return"";if(typeof e=="string")return e;if(typeof e=="number"||typeof e=="boolean")return e.toString();if(typeof e=="object"){if(e instanceof Error)return this.level===0?`${e.message}
${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Object.keys(e);return s.length===0?"{}":s.length<=3?JSON.stringify(e):`{${s.length} keys: ${s.slice(0,3).join(", ")}...}`}return String(e)}formatTool(e,s){if(!s)return e;try{let t=typeof s=="string"?JSON.parse(s):s;if(e==="Bash"&&t.command){let r=t.command.length>50?t.command.substring(0,50)+"...":t.command;return`${e}(${r})`}if(e==="Read"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Edit"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Write"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}return e}catch{return e}}log(e,s,t,r,n){if(e<this.level)return;let i=new Date().toISOString().replace("T"," ").substring(0,23),a=l[e].padEnd(5),d=s.padEnd(6),E="";r?.correlationId?E=`[${r.correlationId}] `:r?.sessionId&&(E=`[session-${r.sessionId}] `);let _="";n!=null&&(this.level===0&&typeof n=="object"?_=`
`+JSON.stringify(n,null,2):_=" "+this.formatData(n));let b="";if(r){let{sessionId:H,sdkSessionId:B,correlationId:j,...f}=r;Object.keys(f).length>0&&(b=` {${Object.entries(f).map(([D,y])=>`${D}=${y}`).join(", ")}}`)}let R=`[${i}] [${a}] [${d}] ${E}${t}${b}${_}`;e===3?console.error(R):console.log(R)}debug(e,s,t,r){this.log(0,e,s,t,r)}info(e,s,t,r){this.log(1,e,s,t,r)}warn(e,s,t,r){this.log(2,e,s,t,r)}error(e,s,t,r){this.log(3,e,s,t,r)}dataIn(e,s,t,r){this.info(e,`\u2192 ${s}`,t,r)}dataOut(e,s,t,r){this.info(e,`\u2190 ${s}`,t,r)}success(e,s,t,r){this.info(e,`\u2713 ${s}`,t,r)}failure(e,s,t,r){this.error(e,`\u2717 ${s}`,t,r)}timing(e,s,t,r){this.info(e,`\u23F1 ${s}`,r,{duration:`${t}ms`})}},A=new T;var m=class{db;constructor(){O(c),this.db=new F(I),this.db.pragma("journal_mode = WAL"),this.db.pragma("synchronous = NORMAL"),this.db.pragma("foreign_keys = ON"),this.initializeSchema(),this.ensureWorkerPortColumn(),this.ensurePromptTrackingColumns(),this.removeSessionSummariesUniqueConstraint(),this.addObservationHierarchicalFields(),this.makeObservationsTextNullable(),this.createUserPromptsTable()}initializeSchema(){try{this.db.exec(`
import{stdin as N}from"process";import F from"better-sqlite3";import{join as c,dirname as x,basename as Y}from"path";import{homedir as h}from"os";import{existsSync as Q,mkdirSync as U}from"fs";import{fileURLToPath as w}from"url";function M(){return typeof __dirname<"u"?__dirname:x(w(import.meta.url))}var X=M(),p=process.env.CLAUDE_MEM_DATA_DIR||c(h(),".claude-mem"),u=process.env.CLAUDE_CONFIG_DIR||c(h(),".claude"),Z=c(p,"archives"),ee=c(p,"logs"),se=c(p,"trash"),te=c(p,"backups"),re=c(p,"settings.json"),I=c(p,"claude-mem.db"),ne=c(p,"vector-db"),oe=c(u,"settings.json"),ie=c(u,"commands"),ae=c(u,"CLAUDE.md");function O(o){U(o,{recursive:!0})}function L(){return c(X,"..","..")}var l=(n=>(n[n.DEBUG=0]="DEBUG",n[n.INFO=1]="INFO",n[n.WARN=2]="WARN",n[n.ERROR=3]="ERROR",n[n.SILENT=4]="SILENT",n))(l||{}),T=class{level;useColor;constructor(){let e=process.env.CLAUDE_MEM_LOG_LEVEL?.toUpperCase()||"INFO";this.level=l[e]??1,this.useColor=process.stdout.isTTY??!1}correlationId(e,s){return`obs-${e}-${s}`}sessionId(e){return`session-${e}`}formatData(e){if(e==null)return"";if(typeof e=="string")return e;if(typeof e=="number"||typeof e=="boolean")return e.toString();if(typeof e=="object"){if(e instanceof Error)return this.level===0?`${e.message}
${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Object.keys(e);return s.length===0?"{}":s.length<=3?JSON.stringify(e):`{${s.length} keys: ${s.slice(0,3).join(", ")}...}`}return String(e)}formatTool(e,s){if(!s)return e;try{let t=typeof s=="string"?JSON.parse(s):s;if(e==="Bash"&&t.command){let r=t.command.length>50?t.command.substring(0,50)+"...":t.command;return`${e}(${r})`}if(e==="Read"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Edit"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Write"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}return e}catch{return e}}log(e,s,t,r,n){if(e<this.level)return;let i=new Date().toISOString().replace("T"," ").substring(0,23),a=l[e].padEnd(5),d=s.padEnd(6),_="";r?.correlationId?_=`[${r.correlationId}] `:r?.sessionId&&(_=`[session-${r.sessionId}] `);let E="";n!=null&&(this.level===0&&typeof n=="object"?E=`
`+JSON.stringify(n,null,2):E=" "+this.formatData(n));let b="";if(r){let{sessionId:W,sdkSessionId:H,correlationId:j,...f}=r;Object.keys(f).length>0&&(b=` {${Object.entries(f).map(([k,y])=>`${k}=${y}`).join(", ")}}`)}let R=`[${i}] [${a}] [${d}] ${_}${t}${b}${E}`;e===3?console.error(R):console.log(R)}debug(e,s,t,r){this.log(0,e,s,t,r)}info(e,s,t,r){this.log(1,e,s,t,r)}warn(e,s,t,r){this.log(2,e,s,t,r)}error(e,s,t,r){this.log(3,e,s,t,r)}dataIn(e,s,t,r){this.info(e,`\u2192 ${s}`,t,r)}dataOut(e,s,t,r){this.info(e,`\u2190 ${s}`,t,r)}success(e,s,t,r){this.info(e,`\u2713 ${s}`,t,r)}failure(e,s,t,r){this.error(e,`\u2717 ${s}`,t,r)}timing(e,s,t,r){this.info(e,`\u23F1 ${s}`,r,{duration:`${t}ms`})}},A=new T;var m=class{db;constructor(){O(p),this.db=new F(I),this.db.pragma("journal_mode = WAL"),this.db.pragma("synchronous = NORMAL"),this.db.pragma("foreign_keys = ON"),this.initializeSchema(),this.ensureWorkerPortColumn(),this.ensurePromptTrackingColumns(),this.removeSessionSummariesUniqueConstraint(),this.addObservationHierarchicalFields(),this.makeObservationsTextNullable(),this.createUserPromptsTable()}initializeSchema(){try{this.db.exec(`
CREATE TABLE IF NOT EXISTS schema_versions (
id INTEGER PRIMARY KEY,
version INTEGER UNIQUE NOT NULL,
@@ -210,7 +210,13 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
FROM observations
WHERE sdk_session_id = ?
ORDER BY created_at_epoch ASC
`).all(e)}getSummaryForSession(e){return this.db.prepare(`
`).all(e)}getObservationsByIds(e,s={}){if(e.length===0)return[];let{orderBy:t="date_desc",limit:r}=s,n=t==="date_asc"?"ASC":"DESC",i=r?`LIMIT ${r}`:"",a=e.map(()=>"?").join(",");return this.db.prepare(`
SELECT *
FROM observations
WHERE id IN (${a})
ORDER BY created_at_epoch ${n}
${i}
`).all(...e)}getSummaryForSession(e){return this.db.prepare(`
SELECT
request, investigated, learned, completed, next_steps,
files_read, files_edited, notes, prompt_number, created_at
@@ -306,5 +312,5 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
UPDATE sdk_sessions
SET status = 'failed', completed_at = ?, completed_at_epoch = ?
WHERE status = 'active'
`).run(e.toISOString(),s).changes}close(){this.db.close()}};import S from"path";import{existsSync as g}from"fs";import{spawn as P}from"child_process";var G=parseInt(process.env.CLAUDE_MEM_WORKER_PORT||"37777",10),W=`http://127.0.0.1:${G}/health`;async function v(){try{return(await fetch(W,{signal:AbortSignal.timeout(500)})).ok}catch{return!1}}async function C(){try{if(await v())return!0;console.error("[claude-mem] Worker not responding, starting...");let o=L(),e=S.join(o,"plugin","scripts","worker-service.cjs");if(!g(e))return console.error(`[claude-mem] Worker service not found at ${e}`),!1;let s=S.join(o,"ecosystem.config.cjs"),t=S.join(o,"node_modules",".bin","pm2");if(!g(t))throw new Error(`PM2 binary not found at ${t}. This is a bundled dependency - try running: npm install`);if(!g(s))throw new Error(`PM2 ecosystem config not found at ${s}. Plugin installation may be corrupted.`);let r=P(t,["start",s],{detached:!0,stdio:"ignore",cwd:o});r.on("error",n=>{throw new Error(`Failed to spawn PM2: ${n.message}`)}),r.unref(),console.error("[claude-mem] Worker started with PM2");for(let n=0;n<3;n++)if(await new Promise(i=>setTimeout(i,500)),await v())return console.error("[claude-mem] Worker is healthy"),!0;return console.error("[claude-mem] Worker failed to become healthy after startup"),!1}catch(o){return console.error(`[claude-mem] Failed to start worker: ${o.message}`),!1}}async function k(o){console.error("[claude-mem cleanup] Hook fired",{input:o?{session_id:o.session_id,cwd:o.cwd,reason:o.reason}:null}),o||(console.log("No input provided - this script is designed to run as a Claude Code SessionEnd hook"),console.log(`
Expected input format:`),console.log(JSON.stringify({session_id:"string",cwd:"string",transcript_path:"string",hook_event_name:"SessionEnd",reason:"exit"},null,2)),process.exit(0));let{session_id:e,reason:s}=o;console.error("[claude-mem cleanup] Searching for active SDK session",{session_id:e,reason:s}),await C()||console.error("[claude-mem cleanup] Worker not available - skipping HTTP cleanup");let r=new m,n=r.findActiveSDKSession(e);n||(console.error("[claude-mem cleanup] No active SDK session found",{session_id:e}),r.close(),console.log('{"continue": true, "suppressOutput": true}'),process.exit(0)),console.error("[claude-mem cleanup] Active SDK session found",{session_id:n.id,sdk_session_id:n.sdk_session_id,project:n.project,worker_port:n.worker_port}),r.markSessionCompleted(n.id),console.error("[claude-mem cleanup] Session marked as completed in database"),r.close(),console.error("[claude-mem cleanup] Cleanup completed successfully"),console.log('{"continue": true, "suppressOutput": true}'),process.exit(0)}if(N.isTTY)k(void 0);else{let o="";N.on("data",e=>o+=e),N.on("end",async()=>{let e=o?JSON.parse(o):void 0;await k(e)})}
`).run(e.toISOString(),s).changes}close(){this.db.close()}};import S from"path";import{existsSync as g}from"fs";import{spawn as P}from"child_process";var G=parseInt(process.env.CLAUDE_MEM_WORKER_PORT||"37777",10),B=`http://127.0.0.1:${G}/health`;async function v(){try{return(await fetch(B,{signal:AbortSignal.timeout(500)})).ok}catch{return!1}}async function C(){try{if(await v())return!0;console.error("[claude-mem] Worker not responding, starting...");let o=L(),e=S.join(o,"plugin","scripts","worker-service.cjs");if(!g(e))return console.error(`[claude-mem] Worker service not found at ${e}`),!1;let s=S.join(o,"ecosystem.config.cjs"),t=S.join(o,"node_modules",".bin","pm2");if(!g(t))throw new Error(`PM2 binary not found at ${t}. This is a bundled dependency - try running: npm install`);if(!g(s))throw new Error(`PM2 ecosystem config not found at ${s}. Plugin installation may be corrupted.`);let r=P(t,["start",s],{detached:!0,stdio:"ignore",cwd:o});r.on("error",n=>{throw new Error(`Failed to spawn PM2: ${n.message}`)}),r.unref(),console.error("[claude-mem] Worker started with PM2");for(let n=0;n<3;n++)if(await new Promise(i=>setTimeout(i,500)),await v())return console.error("[claude-mem] Worker is healthy"),!0;return console.error("[claude-mem] Worker failed to become healthy after startup"),!1}catch(o){return console.error(`[claude-mem] Failed to start worker: ${o.message}`),!1}}async function D(o){console.error("[claude-mem cleanup] Hook fired",{input:o?{session_id:o.session_id,cwd:o.cwd,reason:o.reason}:null}),o||(console.log("No input provided - this script is designed to run as a Claude Code SessionEnd hook"),console.log(`
Expected input format:`),console.log(JSON.stringify({session_id:"string",cwd:"string",transcript_path:"string",hook_event_name:"SessionEnd",reason:"exit"},null,2)),process.exit(0));let{session_id:e,reason:s}=o;console.error("[claude-mem cleanup] Searching for active SDK session",{session_id:e,reason:s}),await C()||console.error("[claude-mem cleanup] Worker not available - skipping HTTP cleanup");let r=new m,n=r.findActiveSDKSession(e);n||(console.error("[claude-mem cleanup] No active SDK session found",{session_id:e}),r.close(),console.log('{"continue": true, "suppressOutput": true}'),process.exit(0)),console.error("[claude-mem cleanup] Active SDK session found",{session_id:n.id,sdk_session_id:n.sdk_session_id,project:n.project,worker_port:n.worker_port}),r.markSessionCompleted(n.id),console.error("[claude-mem cleanup] Session marked as completed in database"),r.close(),console.error("[claude-mem cleanup] Cleanup completed successfully"),console.log('{"continue": true, "suppressOutput": true}'),process.exit(0)}if(N.isTTY)D(void 0);else{let o="";N.on("data",e=>o+=e),N.on("end",async()=>{let e=o?JSON.parse(o):void 0;await D(e)})}
@@ -1,7 +1,7 @@
#!/usr/bin/env node
import W from"path";import{stdin as P}from"process";import ce from"better-sqlite3";import{join as T,dirname as ne,basename as be}from"path";import{homedir as j}from"os";import{existsSync as Oe,mkdirSync as ie}from"fs";import{fileURLToPath as oe}from"url";function ae(){return typeof __dirname<"u"?__dirname:ne(oe(import.meta.url))}var de=ae(),I=process.env.CLAUDE_MEM_DATA_DIR||T(j(),".claude-mem"),U=process.env.CLAUDE_CONFIG_DIR||T(j(),".claude"),ve=T(I,"archives"),Ae=T(I,"logs"),ye=T(I,"trash"),Ce=T(I,"backups"),De=T(I,"settings.json"),Y=T(I,"claude-mem.db"),ke=T(U,"settings.json"),xe=T(U,"commands"),we=T(U,"CLAUDE.md");function K(a){ie(a,{recursive:!0})}function V(){return T(de,"..","..")}var $=(o=>(o[o.DEBUG=0]="DEBUG",o[o.INFO=1]="INFO",o[o.WARN=2]="WARN",o[o.ERROR=3]="ERROR",o[o.SILENT=4]="SILENT",o))($||{}),M=class{level;useColor;constructor(){let e=process.env.CLAUDE_MEM_LOG_LEVEL?.toUpperCase()||"INFO";this.level=$[e]??1,this.useColor=process.stdout.isTTY??!1}correlationId(e,t){return`obs-${e}-${t}`}sessionId(e){return`session-${e}`}formatData(e){if(e==null)return"";if(typeof e=="string")return e;if(typeof e=="number"||typeof e=="boolean")return e.toString();if(typeof e=="object"){if(e instanceof Error)return this.level===0?`${e.message}
${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let t=Object.keys(e);return t.length===0?"{}":t.length<=3?JSON.stringify(e):`{${t.length} keys: ${t.slice(0,3).join(", ")}...}`}return String(e)}formatTool(e,t){if(!t)return e;try{let s=typeof t=="string"?JSON.parse(t):t;if(e==="Bash"&&s.command){let r=s.command.length>50?s.command.substring(0,50)+"...":s.command;return`${e}(${r})`}if(e==="Read"&&s.file_path){let r=s.file_path.split("/").pop()||s.file_path;return`${e}(${r})`}if(e==="Edit"&&s.file_path){let r=s.file_path.split("/").pop()||s.file_path;return`${e}(${r})`}if(e==="Write"&&s.file_path){let r=s.file_path.split("/").pop()||s.file_path;return`${e}(${r})`}return e}catch{return e}}log(e,t,s,r,o){if(e<this.level)return;let c=new Date().toISOString().replace("T"," ").substring(0,23),p=$[e].padEnd(5),u=t.padEnd(6),O="";r?.correlationId?O=`[${r.correlationId}] `:r?.sessionId&&(O=`[session-${r.sessionId}] `);let b="";o!=null&&(this.level===0&&typeof o=="object"?b=`
`+JSON.stringify(o,null,2):b=" "+this.formatData(o));let n="";if(r){let{sessionId:h,sdkSessionId:k,correlationId:L,...C}=r;Object.keys(C).length>0&&(n=` {${Object.entries(C).map(([d,m])=>`${d}=${m}`).join(", ")}}`)}let N=`[${c}] [${p}] [${u}] ${O}${s}${n}${b}`;e===3?console.error(N):console.log(N)}debug(e,t,s,r){this.log(0,e,t,s,r)}info(e,t,s,r){this.log(1,e,t,s,r)}warn(e,t,s,r){this.log(2,e,t,s,r)}error(e,t,s,r){this.log(3,e,t,s,r)}dataIn(e,t,s,r){this.info(e,`\u2192 ${t}`,s,r)}dataOut(e,t,s,r){this.info(e,`\u2190 ${t}`,s,r)}success(e,t,s,r){this.info(e,`\u2713 ${t}`,s,r)}failure(e,t,s,r){this.error(e,`\u2717 ${t}`,s,r)}timing(e,t,s,r){this.info(e,`\u23F1 ${t}`,r,{duration:`${s}ms`})}},q=new M;var D=class{db;constructor(){K(I),this.db=new ce(Y),this.db.pragma("journal_mode = WAL"),this.db.pragma("synchronous = NORMAL"),this.db.pragma("foreign_keys = ON"),this.initializeSchema(),this.ensureWorkerPortColumn(),this.ensurePromptTrackingColumns(),this.removeSessionSummariesUniqueConstraint(),this.addObservationHierarchicalFields(),this.makeObservationsTextNullable(),this.createUserPromptsTable()}initializeSchema(){try{this.db.exec(`
import W from"path";import{stdin as P}from"process";import ce from"better-sqlite3";import{join as E,dirname as ne,basename as be}from"path";import{homedir as j}from"os";import{existsSync as Oe,mkdirSync as ie}from"fs";import{fileURLToPath as oe}from"url";function ae(){return typeof __dirname<"u"?__dirname:ne(oe(import.meta.url))}var de=ae(),b=process.env.CLAUDE_MEM_DATA_DIR||E(j(),".claude-mem"),U=process.env.CLAUDE_CONFIG_DIR||E(j(),".claude"),ve=E(b,"archives"),Ae=E(b,"logs"),ye=E(b,"trash"),Ce=E(b,"backups"),De=E(b,"settings.json"),Y=E(b,"claude-mem.db"),ke=E(b,"vector-db"),xe=E(U,"settings.json"),we=E(U,"commands"),Ue=E(U,"CLAUDE.md");function K(a){ie(a,{recursive:!0})}function V(){return E(de,"..","..")}var $=(i=>(i[i.DEBUG=0]="DEBUG",i[i.INFO=1]="INFO",i[i.WARN=2]="WARN",i[i.ERROR=3]="ERROR",i[i.SILENT=4]="SILENT",i))($||{}),M=class{level;useColor;constructor(){let e=process.env.CLAUDE_MEM_LOG_LEVEL?.toUpperCase()||"INFO";this.level=$[e]??1,this.useColor=process.stdout.isTTY??!1}correlationId(e,t){return`obs-${e}-${t}`}sessionId(e){return`session-${e}`}formatData(e){if(e==null)return"";if(typeof e=="string")return e;if(typeof e=="number"||typeof e=="boolean")return e.toString();if(typeof e=="object"){if(e instanceof Error)return this.level===0?`${e.message}
${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let t=Object.keys(e);return t.length===0?"{}":t.length<=3?JSON.stringify(e):`{${t.length} keys: ${t.slice(0,3).join(", ")}...}`}return String(e)}formatTool(e,t){if(!t)return e;try{let s=typeof t=="string"?JSON.parse(t):t;if(e==="Bash"&&s.command){let r=s.command.length>50?s.command.substring(0,50)+"...":s.command;return`${e}(${r})`}if(e==="Read"&&s.file_path){let r=s.file_path.split("/").pop()||s.file_path;return`${e}(${r})`}if(e==="Edit"&&s.file_path){let r=s.file_path.split("/").pop()||s.file_path;return`${e}(${r})`}if(e==="Write"&&s.file_path){let r=s.file_path.split("/").pop()||s.file_path;return`${e}(${r})`}return e}catch{return e}}log(e,t,s,r,i){if(e<this.level)return;let d=new Date().toISOString().replace("T"," ").substring(0,23),p=$[e].padEnd(5),u=t.padEnd(6),O="";r?.correlationId?O=`[${r.correlationId}] `:r?.sessionId&&(O=`[session-${r.sessionId}] `);let N="";i!=null&&(this.level===0&&typeof i=="object"?N=`
`+JSON.stringify(i,null,2):N=" "+this.formatData(i));let n="";if(r){let{sessionId:h,sdkSessionId:k,correlationId:L,...C}=r;Object.keys(C).length>0&&(n=` {${Object.entries(C).map(([c,m])=>`${c}=${m}`).join(", ")}}`)}let R=`[${d}] [${p}] [${u}] ${O}${s}${n}${N}`;e===3?console.error(R):console.log(R)}debug(e,t,s,r){this.log(0,e,t,s,r)}info(e,t,s,r){this.log(1,e,t,s,r)}warn(e,t,s,r){this.log(2,e,t,s,r)}error(e,t,s,r){this.log(3,e,t,s,r)}dataIn(e,t,s,r){this.info(e,`\u2192 ${t}`,s,r)}dataOut(e,t,s,r){this.info(e,`\u2190 ${t}`,s,r)}success(e,t,s,r){this.info(e,`\u2713 ${t}`,s,r)}failure(e,t,s,r){this.error(e,`\u2717 ${t}`,s,r)}timing(e,t,s,r){this.info(e,`\u23F1 ${t}`,r,{duration:`${s}ms`})}},q=new M;var D=class{db;constructor(){K(b),this.db=new ce(Y),this.db.pragma("journal_mode = WAL"),this.db.pragma("synchronous = NORMAL"),this.db.pragma("foreign_keys = ON"),this.initializeSchema(),this.ensureWorkerPortColumn(),this.ensurePromptTrackingColumns(),this.removeSessionSummariesUniqueConstraint(),this.addObservationHierarchicalFields(),this.makeObservationsTextNullable(),this.createUserPromptsTable()}initializeSchema(){try{this.db.exec(`
CREATE TABLE IF NOT EXISTS schema_versions (
id INTEGER PRIMARY KEY,
version INTEGER UNIQUE NOT NULL,
@@ -210,7 +210,13 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let t=Obje
FROM observations
WHERE sdk_session_id = ?
ORDER BY created_at_epoch ASC
`).all(e)}getSummaryForSession(e){return this.db.prepare(`
`).all(e)}getObservationsByIds(e,t={}){if(e.length===0)return[];let{orderBy:s="date_desc",limit:r}=t,i=s==="date_asc"?"ASC":"DESC",d=r?`LIMIT ${r}`:"",p=e.map(()=>"?").join(",");return this.db.prepare(`
SELECT *
FROM observations
WHERE id IN (${p})
ORDER BY created_at_epoch ${i}
${d}
`).all(...e)}getSummaryForSession(e){return this.db.prepare(`
SELECT
request, investigated, learned, completed, next_steps,
files_read, files_edited, notes, prompt_number, created_at
@@ -222,7 +228,7 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let t=Obje
SELECT files_read, files_modified
FROM observations
WHERE sdk_session_id = ?
`).all(e),r=new Set,o=new Set;for(let c of s){if(c.files_read)try{let p=JSON.parse(c.files_read);Array.isArray(p)&&p.forEach(u=>r.add(u))}catch{}if(c.files_modified)try{let p=JSON.parse(c.files_modified);Array.isArray(p)&&p.forEach(u=>o.add(u))}catch{}}return{filesRead:Array.from(r),filesModified:Array.from(o)}}getSessionById(e){return this.db.prepare(`
`).all(e),r=new Set,i=new Set;for(let d of s){if(d.files_read)try{let p=JSON.parse(d.files_read);Array.isArray(p)&&p.forEach(u=>r.add(u))}catch{}if(d.files_modified)try{let p=JSON.parse(d.files_modified);Array.isArray(p)&&p.forEach(u=>i.add(u))}catch{}}return{filesRead:Array.from(r),filesModified:Array.from(i)}}getSessionById(e){return this.db.prepare(`
SELECT id, claude_session_id, sdk_session_id, project, user_prompt
FROM sdk_sessions
WHERE id = ?
@@ -249,11 +255,11 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let t=Obje
SELECT prompt_counter FROM sdk_sessions WHERE id = ?
`).get(e)?.prompt_counter||1}getPromptCounter(e){return this.db.prepare(`
SELECT prompt_counter FROM sdk_sessions WHERE id = ?
`).get(e)?.prompt_counter||0}createSDKSession(e,t,s){let r=new Date,o=r.getTime(),p=this.db.prepare(`
`).get(e)?.prompt_counter||0}createSDKSession(e,t,s){let r=new Date,i=r.getTime(),p=this.db.prepare(`
INSERT OR IGNORE INTO sdk_sessions
(claude_session_id, sdk_session_id, project, user_prompt, started_at, started_at_epoch, status)
VALUES (?, ?, ?, ?, ?, ?, 'active')
`).run(e,e,t,s,r.toISOString(),o);return p.lastInsertRowid===0||p.changes===0?this.db.prepare(`
`).run(e,e,t,s,r.toISOString(),i);return p.lastInsertRowid===0||p.changes===0?this.db.prepare(`
SELECT id FROM sdk_sessions WHERE claude_session_id = ? LIMIT 1
`).get(e).id:p.lastInsertRowid}updateSDKSessionId(e,t){return this.db.prepare(`
UPDATE sdk_sessions
@@ -268,33 +274,33 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let t=Obje
FROM sdk_sessions
WHERE id = ?
LIMIT 1
`).get(e)?.worker_port||null}saveUserPrompt(e,t,s){let r=new Date,o=r.getTime();return this.db.prepare(`
`).get(e)?.worker_port||null}saveUserPrompt(e,t,s){let r=new Date,i=r.getTime();return this.db.prepare(`
INSERT INTO user_prompts
(claude_session_id, prompt_number, prompt_text, created_at, created_at_epoch)
VALUES (?, ?, ?, ?, ?)
`).run(e,t,s,r.toISOString(),o).lastInsertRowid}storeObservation(e,t,s,r){let o=new Date,c=o.getTime();this.db.prepare(`
`).run(e,t,s,r.toISOString(),i).lastInsertRowid}storeObservation(e,t,s,r){let i=new Date,d=i.getTime();this.db.prepare(`
SELECT id FROM sdk_sessions WHERE sdk_session_id = ?
`).get(e)||(this.db.prepare(`
INSERT INTO sdk_sessions
(claude_session_id, sdk_session_id, project, started_at, started_at_epoch, status)
VALUES (?, ?, ?, ?, ?, 'active')
`).run(e,e,t,o.toISOString(),c),console.error(`[SessionStore] Auto-created session record for session_id: ${e}`)),this.db.prepare(`
`).run(e,e,t,i.toISOString(),d),console.error(`[SessionStore] Auto-created session record for session_id: ${e}`)),this.db.prepare(`
INSERT INTO observations
(sdk_session_id, project, type, title, subtitle, facts, narrative, concepts,
files_read, files_modified, prompt_number, created_at, created_at_epoch)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
`).run(e,t,s.type,s.title,s.subtitle,JSON.stringify(s.facts),s.narrative,JSON.stringify(s.concepts),JSON.stringify(s.files_read),JSON.stringify(s.files_modified),r||null,o.toISOString(),c)}storeSummary(e,t,s,r){let o=new Date,c=o.getTime();this.db.prepare(`
`).run(e,t,s.type,s.title,s.subtitle,JSON.stringify(s.facts),s.narrative,JSON.stringify(s.concepts),JSON.stringify(s.files_read),JSON.stringify(s.files_modified),r||null,i.toISOString(),d)}storeSummary(e,t,s,r){let i=new Date,d=i.getTime();this.db.prepare(`
SELECT id FROM sdk_sessions WHERE sdk_session_id = ?
`).get(e)||(this.db.prepare(`
INSERT INTO sdk_sessions
(claude_session_id, sdk_session_id, project, started_at, started_at_epoch, status)
VALUES (?, ?, ?, ?, ?, 'active')
`).run(e,e,t,o.toISOString(),c),console.error(`[SessionStore] Auto-created session record for session_id: ${e}`)),this.db.prepare(`
`).run(e,e,t,i.toISOString(),d),console.error(`[SessionStore] Auto-created session record for session_id: ${e}`)),this.db.prepare(`
INSERT INTO session_summaries
(sdk_session_id, project, request, investigated, learned, completed,
next_steps, notes, prompt_number, created_at, created_at_epoch)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
`).run(e,t,s.request,s.investigated,s.learned,s.completed,s.next_steps,s.notes,r||null,o.toISOString(),c)}markSessionCompleted(e){let t=new Date,s=t.getTime();this.db.prepare(`
`).run(e,t,s.request,s.investigated,s.learned,s.completed,s.next_steps,s.notes,r||null,i.toISOString(),d)}markSessionCompleted(e){let t=new Date,s=t.getTime();this.db.prepare(`
UPDATE sdk_sessions
SET status = 'completed', completed_at = ?, completed_at_epoch = ?
WHERE id = ?
@@ -306,7 +312,7 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let t=Obje
UPDATE sdk_sessions
SET status = 'failed', completed_at = ?, completed_at_epoch = ?
WHERE status = 'active'
`).run(e.toISOString(),t).changes}close(){this.db.close()}};import X from"path";import{existsSync as F}from"fs";import{spawn as pe}from"child_process";var ue=parseInt(process.env.CLAUDE_MEM_WORKER_PORT||"37777",10),le=`http://127.0.0.1:${ue}/health`;async function J(){try{return(await fetch(le,{signal:AbortSignal.timeout(500)})).ok}catch{return!1}}async function Q(){try{if(await J())return!0;console.error("[claude-mem] Worker not responding, starting...");let a=V(),e=X.join(a,"plugin","scripts","worker-service.cjs");if(!F(e))return console.error(`[claude-mem] Worker service not found at ${e}`),!1;let t=X.join(a,"ecosystem.config.cjs"),s=X.join(a,"node_modules",".bin","pm2");if(!F(s))throw new Error(`PM2 binary not found at ${s}. This is a bundled dependency - try running: npm install`);if(!F(t))throw new Error(`PM2 ecosystem config not found at ${t}. Plugin installation may be corrupted.`);let r=pe(s,["start",t],{detached:!0,stdio:"ignore",cwd:a});r.on("error",o=>{throw new Error(`Failed to spawn PM2: ${o.message}`)}),r.unref(),console.error("[claude-mem] Worker started with PM2");for(let o=0;o<3;o++)if(await new Promise(c=>setTimeout(c,500)),await J())return console.error("[claude-mem] Worker is healthy"),!0;return console.error("[claude-mem] Worker failed to become healthy after startup"),!1}catch(a){return console.error(`[claude-mem] Failed to start worker: ${a.message}`),!1}}var z=8,i={reset:"\x1B[0m",bright:"\x1B[1m",dim:"\x1B[2m",cyan:"\x1B[36m",green:"\x1B[32m",yellow:"\x1B[33m",blue:"\x1B[34m",magenta:"\x1B[35m",gray:"\x1B[90m",red:"\x1B[31m"};function G(a){if(!a)return[];let e=JSON.parse(a);return Array.isArray(e)?e:[]}function me(a){return new Date(a).toLocaleString("en-US",{month:"short",day:"numeric",hour:"numeric",minute:"2-digit",hour12:!0})}function _e(a){return new Date(a).toLocaleString("en-US",{hour:"numeric",minute:"2-digit",hour12:!0})}function Ee(a){return new Date(a).toLocaleString("en-US",{month:"short",day:"numeric",year:"numeric"})}function 
Te(a){return a?Math.ceil(a.length/4):0}function he(a,e){return W.isAbsolute(a)?W.relative(e,a):a}function ge(a,e){if(e.length===0)return[];let t=e.map(()=>"?").join(",");return a.db.prepare(`
`).run(e.toISOString(),t).changes}close(){this.db.close()}};import X from"path";import{existsSync as F}from"fs";import{spawn as pe}from"child_process";var ue=parseInt(process.env.CLAUDE_MEM_WORKER_PORT||"37777",10),le=`http://127.0.0.1:${ue}/health`;async function J(){try{return(await fetch(le,{signal:AbortSignal.timeout(500)})).ok}catch{return!1}}async function Q(){try{if(await J())return!0;console.error("[claude-mem] Worker not responding, starting...");let a=V(),e=X.join(a,"plugin","scripts","worker-service.cjs");if(!F(e))return console.error(`[claude-mem] Worker service not found at ${e}`),!1;let t=X.join(a,"ecosystem.config.cjs"),s=X.join(a,"node_modules",".bin","pm2");if(!F(s))throw new Error(`PM2 binary not found at ${s}. This is a bundled dependency - try running: npm install`);if(!F(t))throw new Error(`PM2 ecosystem config not found at ${t}. Plugin installation may be corrupted.`);let r=pe(s,["start",t],{detached:!0,stdio:"ignore",cwd:a});r.on("error",i=>{throw new Error(`Failed to spawn PM2: ${i.message}`)}),r.unref(),console.error("[claude-mem] Worker started with PM2");for(let i=0;i<3;i++)if(await new Promise(d=>setTimeout(d,500)),await J())return console.error("[claude-mem] Worker is healthy"),!0;return console.error("[claude-mem] Worker failed to become healthy after startup"),!1}catch(a){return console.error(`[claude-mem] Failed to start worker: ${a.message}`),!1}}var z=8,o={reset:"\x1B[0m",bright:"\x1B[1m",dim:"\x1B[2m",cyan:"\x1B[36m",green:"\x1B[32m",yellow:"\x1B[33m",blue:"\x1B[34m",magenta:"\x1B[35m",gray:"\x1B[90m",red:"\x1B[31m"};function G(a){if(!a)return[];let e=JSON.parse(a);return Array.isArray(e)?e:[]}function me(a){return new Date(a).toLocaleString("en-US",{month:"short",day:"numeric",hour:"numeric",minute:"2-digit",hour12:!0})}function _e(a){return new Date(a).toLocaleString("en-US",{hour:"numeric",minute:"2-digit",hour12:!0})}function Ee(a){return new Date(a).toLocaleString("en-US",{month:"short",day:"numeric",year:"numeric"})}function 
Te(a){return a?Math.ceil(a.length/4):0}function he(a,e){return W.isAbsolute(a)?W.relative(e,a):a}function ge(a,e){if(e.length===0)return[];let t=e.map(()=>"?").join(",");return a.db.prepare(`
SELECT
id, sdk_session_id, type, title, subtitle, narrative,
facts, concepts, files_read, files_modified,
@@ -314,18 +320,18 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let t=Obje
FROM observations
WHERE sdk_session_id IN (${t})
ORDER BY created_at_epoch DESC
`).all(...e)}function Z(a,e=!1,t=!1){Q();let s=a?.cwd??process.cwd(),r=s?W.basename(s):"unknown-project",o=new D,c=o.db.prepare(`
`).all(...e)}function Z(a,e=!1,t=!1){Q();let s=a?.cwd??process.cwd(),r=s?W.basename(s):"unknown-project",i=new D,d=i.db.prepare(`
SELECT id, sdk_session_id, request, completed, next_steps, created_at, created_at_epoch
FROM session_summaries
WHERE project = ?
ORDER BY created_at_epoch DESC
LIMIT ?
`).all(r,z+1);if(c.length===0)return o.close(),e?`
${i.bright}${i.cyan}\u{1F4DD} [${r}] recent context${i.reset}
${i.gray}${"\u2500".repeat(60)}${i.reset}
`).all(r,z+1);if(d.length===0)return i.close(),e?`
${o.bright}${o.cyan}\u{1F4DD} [${r}] recent context${o.reset}
${o.gray}${"\u2500".repeat(60)}${o.reset}
${i.dim}No previous sessions found for this project yet.${i.reset}
${o.dim}No previous sessions found for this project yet.${o.reset}
`:`# [${r}] recent context
No previous sessions found for this project yet.`;let p=c.slice(0,z),u=[...new Set(p.map(N=>N.sdk_session_id))],b=ge(o,u).filter(N=>{let h=G(N.concepts);return h.includes("what-changed")||h.includes("how-it-works")||h.includes("problem-solution")||h.includes("gotcha")||h.includes("discovery")||h.includes("why-it-exists")||h.includes("decision")||h.includes("trade-off")}),n=[];if(e?(n.push(""),n.push(`${i.bright}${i.cyan}\u{1F4DD} [${r}] recent context${i.reset}`),n.push(`${i.gray}${"\u2500".repeat(60)}${i.reset}`),n.push("")):(n.push(`# [${r}] recent context`),n.push("")),b.length>0){e?(n.push(`${i.dim}Legend: \u{1F3AF} session-request | \u{1F534} gotcha | \u{1F7E1} problem-solution | \u{1F535} how-it-works | \u{1F7E2} what-changed | \u{1F7E3} discovery | \u{1F7E0} why-it-exists | \u{1F7E4} decision | \u2696\uFE0F trade-off${i.reset}`),n.push("")):(n.push("**Legend:** \u{1F3AF} session-request | \u{1F534} gotcha | \u{1F7E1} problem-solution | \u{1F535} how-it-works | \u{1F7E2} what-changed | \u{1F7E3} discovery | \u{1F7E0} why-it-exists | \u{1F7E4} decision | \u2696\uFE0F trade-off"),n.push("")),e?(n.push(`${i.dim}\u{1F4A1} Progressive Disclosure: This index shows WHAT exists (titles) and retrieval COST (token counts).${i.reset}`),n.push(`${i.dim} \u2192 Use MCP search tools to fetch full observation details on-demand (Layer 2)${i.reset}`),n.push(`${i.dim} \u2192 Prefer searching observations over re-reading code for past decisions and learnings${i.reset}`),n.push(`${i.dim} \u2192 Critical types (\u{1F534} gotcha, \u{1F7E4} decision, \u2696\uFE0F trade-off) often worth fetching immediately${i.reset}`),n.push("")):(n.push("\u{1F4A1} **Progressive Disclosure:** This index shows WHAT exists (titles) and retrieval COST (token counts)."),n.push("- Use MCP search tools to fetch full observation details on-demand (Layer 2)"),n.push("- Prefer searching observations over re-reading code for past decisions and learnings"),n.push("- Critical types (\u{1F534} gotcha, \u{1F7E4} 
decision, \u2696\uFE0F trade-off) often worth fetching immediately"),n.push(""));let N=c[0]?.id,h=p.map((d,m)=>{let l=m===0?null:c[m+1];return{...d,displayEpoch:l?l.created_at_epoch:d.created_at_epoch,displayTime:l?l.created_at:d.created_at,isMostRecent:d.id===N}}),k=[...b.map(d=>({type:"observation",data:d})),...h.map(d=>({type:"summary",data:d}))];k.sort((d,m)=>{let l=d.type==="observation"?d.data.created_at_epoch:d.data.displayEpoch,R=m.type==="observation"?m.data.created_at_epoch:m.data.displayEpoch;return l-R});let L=new Map;for(let d of k){let m=d.type==="observation"?d.data.created_at:d.data.displayTime,l=Ee(m);L.has(l)||L.set(l,[]),L.get(l).push(d)}let C=Array.from(L.entries()).sort((d,m)=>{let l=new Date(d[0]).getTime(),R=new Date(m[0]).getTime();return l-R});for(let[d,m]of C){e?(n.push(`${i.bright}${i.cyan}${d}${i.reset}`),n.push("")):(n.push(`### ${d}`),n.push(""));let l=null,R="",v=!1;for(let x of m)if(x.type==="summary"){v&&(n.push(""),v=!1,l=null,R="");let _=x.data,A=`${_.request||"Session started"} (${me(_.displayTime)})`,S=_.isMostRecent?"":`claude-mem://session-summary/${_.id}`;if(e){let E=S?`${i.dim}[${S}]${i.reset}`:"";n.push(`\u{1F3AF} ${i.yellow}#S${_.id}${i.reset} ${A} ${E}`)}else{let E=S?` [\u2192](${S})`:"";n.push(`**\u{1F3AF} #S${_.id}** ${A}${E}`)}n.push("")}else{let _=x.data,A=G(_.files_modified),S=A.length>0?he(A[0],s):"General";S!==l&&(v&&n.push(""),e?n.push(`${i.dim}${S}${i.reset}`):n.push(`**${S}**`),e||(n.push("| ID | Time | T | Title | Tokens |"),n.push("|----|------|---|-------|--------|")),l=S,v=!0,R="");let E=G(_.concepts),f="\u2022";E.includes("gotcha")?f="\u{1F534}":E.includes("decision")?f="\u{1F7E4}":E.includes("trade-off")?f="\u2696\uFE0F":E.includes("problem-solution")?f="\u{1F7E1}":E.includes("discovery")?f="\u{1F7E3}":E.includes("why-it-exists")?f="\u{1F7E0}":E.includes("how-it-works")?f="\u{1F535}":E.includes("what-changed")&&(f="\u{1F7E2}");let 
y=_e(_.created_at),H=_.title||"Untitled",w=Te(_.narrative),B=y!==R,se=B?y:"";if(R=y,e){let te=B?`${i.dim}${y}${i.reset}`:" ".repeat(y.length),re=w>0?`${i.dim}(~${w}t)${i.reset}`:"";n.push(` ${i.dim}#${_.id}${i.reset} ${te} ${f} ${H} ${re}`)}else n.push(`| #${_.id} | ${se||"\u2033"} | ${f} | ${H} | ~${w} |`)}v&&n.push("")}let g=c[0];g&&(g.completed||g.next_steps)&&(g.completed&&(e?n.push(`${i.green}Completed:${i.reset} ${g.completed}`):n.push(`**Completed**: ${g.completed}`),n.push("")),g.next_steps&&(e?n.push(`${i.magenta}Next Steps:${i.reset} ${g.next_steps}`):n.push(`**Next Steps**: ${g.next_steps}`),n.push(""))),e?n.push(`${i.dim}Use claude-mem MCP search to access records with the given ID${i.reset}`):n.push("*Use claude-mem MCP search to access records with the given ID*")}return o.close(),n.join(`
No previous sessions found for this project yet.`;let p=d.slice(0,z),u=[...new Set(p.map(R=>R.sdk_session_id))],N=ge(i,u).filter(R=>{let h=G(R.concepts);return h.includes("what-changed")||h.includes("how-it-works")||h.includes("problem-solution")||h.includes("gotcha")||h.includes("discovery")||h.includes("why-it-exists")||h.includes("decision")||h.includes("trade-off")}),n=[];if(e?(n.push(""),n.push(`${o.bright}${o.cyan}\u{1F4DD} [${r}] recent context${o.reset}`),n.push(`${o.gray}${"\u2500".repeat(60)}${o.reset}`),n.push("")):(n.push(`# [${r}] recent context`),n.push("")),N.length>0){e?(n.push(`${o.dim}Legend: \u{1F3AF} session-request | \u{1F534} gotcha | \u{1F7E1} problem-solution | \u{1F535} how-it-works | \u{1F7E2} what-changed | \u{1F7E3} discovery | \u{1F7E0} why-it-exists | \u{1F7E4} decision | \u2696\uFE0F trade-off${o.reset}`),n.push("")):(n.push("**Legend:** \u{1F3AF} session-request | \u{1F534} gotcha | \u{1F7E1} problem-solution | \u{1F535} how-it-works | \u{1F7E2} what-changed | \u{1F7E3} discovery | \u{1F7E0} why-it-exists | \u{1F7E4} decision | \u2696\uFE0F trade-off"),n.push("")),e?(n.push(`${o.dim}\u{1F4A1} Progressive Disclosure: This index shows WHAT exists (titles) and retrieval COST (token counts).${o.reset}`),n.push(`${o.dim} \u2192 Use MCP search tools to fetch full observation details on-demand (Layer 2)${o.reset}`),n.push(`${o.dim} \u2192 Prefer searching observations over re-reading code for past decisions and learnings${o.reset}`),n.push(`${o.dim} \u2192 Critical types (\u{1F534} gotcha, \u{1F7E4} decision, \u2696\uFE0F trade-off) often worth fetching immediately${o.reset}`),n.push("")):(n.push("\u{1F4A1} **Progressive Disclosure:** This index shows WHAT exists (titles) and retrieval COST (token counts)."),n.push("- Use MCP search tools to fetch full observation details on-demand (Layer 2)"),n.push("- Prefer searching observations over re-reading code for past decisions and learnings"),n.push("- Critical types (\u{1F534} gotcha, \u{1F7E4} 
decision, \u2696\uFE0F trade-off) often worth fetching immediately"),n.push(""));let R=d[0]?.id,h=p.map((c,m)=>{let l=m===0?null:d[m+1];return{...c,displayEpoch:l?l.created_at_epoch:c.created_at_epoch,displayTime:l?l.created_at:c.created_at,isMostRecent:c.id===R}}),k=[...N.map(c=>({type:"observation",data:c})),...h.map(c=>({type:"summary",data:c}))];k.sort((c,m)=>{let l=c.type==="observation"?c.data.created_at_epoch:c.data.displayEpoch,I=m.type==="observation"?m.data.created_at_epoch:m.data.displayEpoch;return l-I});let L=new Map;for(let c of k){let m=c.type==="observation"?c.data.created_at:c.data.displayTime,l=Ee(m);L.has(l)||L.set(l,[]),L.get(l).push(c)}let C=Array.from(L.entries()).sort((c,m)=>{let l=new Date(c[0]).getTime(),I=new Date(m[0]).getTime();return l-I});for(let[c,m]of C){e?(n.push(`${o.bright}${o.cyan}${c}${o.reset}`),n.push("")):(n.push(`### ${c}`),n.push(""));let l=null,I="",v=!1;for(let x of m)if(x.type==="summary"){v&&(n.push(""),v=!1,l=null,I="");let _=x.data,A=`${_.request||"Session started"} (${me(_.displayTime)})`,S=_.isMostRecent?"":`claude-mem://session-summary/${_.id}`;if(e){let T=S?`${o.dim}[${S}]${o.reset}`:"";n.push(`\u{1F3AF} ${o.yellow}#S${_.id}${o.reset} ${A} ${T}`)}else{let T=S?` [\u2192](${S})`:"";n.push(`**\u{1F3AF} #S${_.id}** ${A}${T}`)}n.push("")}else{let _=x.data,A=G(_.files_modified),S=A.length>0?he(A[0],s):"General";S!==l&&(v&&n.push(""),e?n.push(`${o.dim}${S}${o.reset}`):n.push(`**${S}**`),e||(n.push("| ID | Time | T | Title | Tokens |"),n.push("|----|------|---|-------|--------|")),l=S,v=!0,I="");let T=G(_.concepts),f="\u2022";T.includes("gotcha")?f="\u{1F534}":T.includes("decision")?f="\u{1F7E4}":T.includes("trade-off")?f="\u2696\uFE0F":T.includes("problem-solution")?f="\u{1F7E1}":T.includes("discovery")?f="\u{1F7E3}":T.includes("why-it-exists")?f="\u{1F7E0}":T.includes("how-it-works")?f="\u{1F535}":T.includes("what-changed")&&(f="\u{1F7E2}");let 
y=_e(_.created_at),B=_.title||"Untitled",w=Te(_.narrative),H=y!==I,se=H?y:"";if(I=y,e){let te=H?`${o.dim}${y}${o.reset}`:" ".repeat(y.length),re=w>0?`${o.dim}(~${w}t)${o.reset}`:"";n.push(` ${o.dim}#${_.id}${o.reset} ${te} ${f} ${B} ${re}`)}else n.push(`| #${_.id} | ${se||"\u2033"} | ${f} | ${B} | ~${w} |`)}v&&n.push("")}let g=d[0];g&&(g.completed||g.next_steps)&&(g.completed&&(e?n.push(`${o.green}Completed:${o.reset} ${g.completed}`):n.push(`**Completed**: ${g.completed}`),n.push("")),g.next_steps&&(e?n.push(`${o.magenta}Next Steps:${o.reset} ${g.next_steps}`):n.push(`**Next Steps**: ${g.next_steps}`),n.push(""))),e?n.push(`${o.dim}Use claude-mem MCP search to access records with the given ID${o.reset}`):n.push("*Use claude-mem MCP search to access records with the given ID*")}return i.close(),n.join(`
`).trimEnd()}var ee=process.argv.includes("--index"),fe=process.argv.includes("--colors");if(P.isTTY||fe){let a=Z(void 0,!0,ee);console.log(a),process.exit(0)}else{let a="";P.on("data",e=>a+=e),P.on("end",()=>{let e=a.trim()?JSON.parse(a):void 0,s={hookSpecificOutput:{hookEventName:"SessionStart",additionalContext:Z(e,!1,ee)}};console.log(JSON.stringify(s)),process.exit(0)})}
@@ -1,7 +1,7 @@
#!/usr/bin/env node
import j from"path";import{stdin as x}from"process";import G from"better-sqlite3";import{join as p,dirname as X,basename as z}from"path";import{homedir as h}from"os";import{existsSync as te,mkdirSync as M}from"fs";import{fileURLToPath as P}from"url";function F(){return typeof __dirname<"u"?__dirname:X(P(import.meta.url))}var H=F(),u=process.env.CLAUDE_MEM_DATA_DIR||p(h(),".claude-mem"),l=process.env.CLAUDE_CONFIG_DIR||p(h(),".claude"),ne=p(u,"archives"),oe=p(u,"logs"),ie=p(u,"trash"),ae=p(u,"backups"),de=p(u,"settings.json"),O=p(u,"claude-mem.db"),pe=p(l,"settings.json"),ce=p(l,"commands"),Ee=p(l,"CLAUDE.md");function I(o){M(o,{recursive:!0})}function L(){return p(H,"..","..")}var T=(n=>(n[n.DEBUG=0]="DEBUG",n[n.INFO=1]="INFO",n[n.WARN=2]="WARN",n[n.ERROR=3]="ERROR",n[n.SILENT=4]="SILENT",n))(T||{}),S=class{level;useColor;constructor(){let e=process.env.CLAUDE_MEM_LOG_LEVEL?.toUpperCase()||"INFO";this.level=T[e]??1,this.useColor=process.stdout.isTTY??!1}correlationId(e,s){return`obs-${e}-${s}`}sessionId(e){return`session-${e}`}formatData(e){if(e==null)return"";if(typeof e=="string")return e;if(typeof e=="number"||typeof e=="boolean")return e.toString();if(typeof e=="object"){if(e instanceof Error)return this.level===0?`${e.message}
${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Object.keys(e);return s.length===0?"{}":s.length<=3?JSON.stringify(e):`{${s.length} keys: ${s.slice(0,3).join(", ")}...}`}return String(e)}formatTool(e,s){if(!s)return e;try{let t=typeof s=="string"?JSON.parse(s):s;if(e==="Bash"&&t.command){let r=t.command.length>50?t.command.substring(0,50)+"...":t.command;return`${e}(${r})`}if(e==="Read"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Edit"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Write"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}return e}catch{return e}}log(e,s,t,r,n){if(e<this.level)return;let i=new Date().toISOString().replace("T"," ").substring(0,23),a=T[e].padEnd(5),d=s.padEnd(6),E="";r?.correlationId?E=`[${r.correlationId}] `:r?.sessionId&&(E=`[session-${r.sessionId}] `);let c="";n!=null&&(this.level===0&&typeof n=="object"?c=`
`+JSON.stringify(n,null,2):c=" "+this.formatData(n));let _="";if(r){let{sessionId:K,sdkSessionId:V,correlationId:q,...f}=r;Object.keys(f).length>0&&(_=` {${Object.entries(f).map(([U,w])=>`${U}=${w}`).join(", ")}}`)}let R=`[${i}] [${a}] [${d}] ${E}${t}${_}${c}`;e===3?console.error(R):console.log(R)}debug(e,s,t,r){this.log(0,e,s,t,r)}info(e,s,t,r){this.log(1,e,s,t,r)}warn(e,s,t,r){this.log(2,e,s,t,r)}error(e,s,t,r){this.log(3,e,s,t,r)}dataIn(e,s,t,r){this.info(e,`\u2192 ${s}`,t,r)}dataOut(e,s,t,r){this.info(e,`\u2190 ${s}`,t,r)}success(e,s,t,r){this.info(e,`\u2713 ${s}`,t,r)}failure(e,s,t,r){this.error(e,`\u2717 ${s}`,t,r)}timing(e,s,t,r){this.info(e,`\u23F1 ${s}`,r,{duration:`${t}ms`})}},A=new S;var m=class{db;constructor(){I(u),this.db=new G(O),this.db.pragma("journal_mode = WAL"),this.db.pragma("synchronous = NORMAL"),this.db.pragma("foreign_keys = ON"),this.initializeSchema(),this.ensureWorkerPortColumn(),this.ensurePromptTrackingColumns(),this.removeSessionSummariesUniqueConstraint(),this.addObservationHierarchicalFields(),this.makeObservationsTextNullable(),this.createUserPromptsTable()}initializeSchema(){try{this.db.exec(`
import j from"path";import{stdin as x}from"process";import G from"better-sqlite3";import{join as p,dirname as M,basename as z}from"path";import{homedir as h}from"os";import{existsSync as te,mkdirSync as X}from"fs";import{fileURLToPath as P}from"url";function F(){return typeof __dirname<"u"?__dirname:M(P(import.meta.url))}var H=F(),E=process.env.CLAUDE_MEM_DATA_DIR||p(h(),".claude-mem"),l=process.env.CLAUDE_CONFIG_DIR||p(h(),".claude"),ne=p(E,"archives"),oe=p(E,"logs"),ie=p(E,"trash"),ae=p(E,"backups"),de=p(E,"settings.json"),O=p(E,"claude-mem.db"),pe=p(E,"vector-db"),ce=p(l,"settings.json"),Ee=p(l,"commands"),ue=p(l,"CLAUDE.md");function I(o){X(o,{recursive:!0})}function L(){return p(H,"..","..")}var T=(n=>(n[n.DEBUG=0]="DEBUG",n[n.INFO=1]="INFO",n[n.WARN=2]="WARN",n[n.ERROR=3]="ERROR",n[n.SILENT=4]="SILENT",n))(T||{}),S=class{level;useColor;constructor(){let e=process.env.CLAUDE_MEM_LOG_LEVEL?.toUpperCase()||"INFO";this.level=T[e]??1,this.useColor=process.stdout.isTTY??!1}correlationId(e,s){return`obs-${e}-${s}`}sessionId(e){return`session-${e}`}formatData(e){if(e==null)return"";if(typeof e=="string")return e;if(typeof e=="number"||typeof e=="boolean")return e.toString();if(typeof e=="object"){if(e instanceof Error)return this.level===0?`${e.message}
${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Object.keys(e);return s.length===0?"{}":s.length<=3?JSON.stringify(e):`{${s.length} keys: ${s.slice(0,3).join(", ")}...}`}return String(e)}formatTool(e,s){if(!s)return e;try{let t=typeof s=="string"?JSON.parse(s):s;if(e==="Bash"&&t.command){let r=t.command.length>50?t.command.substring(0,50)+"...":t.command;return`${e}(${r})`}if(e==="Read"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Edit"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Write"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}return e}catch{return e}}log(e,s,t,r,n){if(e<this.level)return;let i=new Date().toISOString().replace("T"," ").substring(0,23),a=T[e].padEnd(5),d=s.padEnd(6),u="";r?.correlationId?u=`[${r.correlationId}] `:r?.sessionId&&(u=`[session-${r.sessionId}] `);let c="";n!=null&&(this.level===0&&typeof n=="object"?c=`
`+JSON.stringify(n,null,2):c=" "+this.formatData(n));let _="";if(r){let{sessionId:K,sdkSessionId:V,correlationId:q,...f}=r;Object.keys(f).length>0&&(_=` {${Object.entries(f).map(([U,w])=>`${U}=${w}`).join(", ")}}`)}let N=`[${i}] [${a}] [${d}] ${u}${t}${_}${c}`;e===3?console.error(N):console.log(N)}debug(e,s,t,r){this.log(0,e,s,t,r)}info(e,s,t,r){this.log(1,e,s,t,r)}warn(e,s,t,r){this.log(2,e,s,t,r)}error(e,s,t,r){this.log(3,e,s,t,r)}dataIn(e,s,t,r){this.info(e,`\u2192 ${s}`,t,r)}dataOut(e,s,t,r){this.info(e,`\u2190 ${s}`,t,r)}success(e,s,t,r){this.info(e,`\u2713 ${s}`,t,r)}failure(e,s,t,r){this.error(e,`\u2717 ${s}`,t,r)}timing(e,s,t,r){this.info(e,`\u23F1 ${s}`,r,{duration:`${t}ms`})}},A=new S;var m=class{db;constructor(){I(E),this.db=new G(O),this.db.pragma("journal_mode = WAL"),this.db.pragma("synchronous = NORMAL"),this.db.pragma("foreign_keys = ON"),this.initializeSchema(),this.ensureWorkerPortColumn(),this.ensurePromptTrackingColumns(),this.removeSessionSummariesUniqueConstraint(),this.addObservationHierarchicalFields(),this.makeObservationsTextNullable(),this.createUserPromptsTable()}initializeSchema(){try{this.db.exec(`
CREATE TABLE IF NOT EXISTS schema_versions (
id INTEGER PRIMARY KEY,
version INTEGER UNIQUE NOT NULL,
@@ -210,7 +210,13 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
FROM observations
WHERE sdk_session_id = ?
ORDER BY created_at_epoch ASC
`).all(e)}getSummaryForSession(e){return this.db.prepare(`
`).all(e)}getObservationsByIds(e,s={}){if(e.length===0)return[];let{orderBy:t="date_desc",limit:r}=s,n=t==="date_asc"?"ASC":"DESC",i=r?`LIMIT ${r}`:"",a=e.map(()=>"?").join(",");return this.db.prepare(`
SELECT *
FROM observations
WHERE id IN (${a})
ORDER BY created_at_epoch ${n}
${i}
`).all(...e)}getSummaryForSession(e){return this.db.prepare(`
SELECT
request, investigated, learned, completed, next_steps,
files_read, files_edited, notes, prompt_number, created_at
@@ -306,4 +312,4 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
UPDATE sdk_sessions
SET status = 'failed', completed_at = ?, completed_at_epoch = ?
WHERE status = 'active'
`).run(e.toISOString(),s).changes}close(){this.db.close()}};function W(o,e,s){return o==="PreCompact"?e?{continue:!0,suppressOutput:!0}:{continue:!1,stopReason:s.reason||"Pre-compact operation failed",suppressOutput:!0}:o==="SessionStart"?e&&s.context?{continue:!0,suppressOutput:!0,hookSpecificOutput:{hookEventName:"SessionStart",additionalContext:s.context}}:{continue:!0,suppressOutput:!0}:o==="UserPromptSubmit"||o==="PostToolUse"?{continue:!0,suppressOutput:!0}:o==="Stop"?{continue:!0,suppressOutput:!0}:{continue:e,suppressOutput:!0,...s.reason&&!e?{stopReason:s.reason}:{}}}function v(o,e,s={}){let t=W(o,e,s);return JSON.stringify(t)}import g from"path";import{existsSync as b}from"fs";import{spawn as B}from"child_process";var k=parseInt(process.env.CLAUDE_MEM_WORKER_PORT||"37777",10),$=`http://127.0.0.1:${k}/health`;async function C(){try{return(await fetch($,{signal:AbortSignal.timeout(500)})).ok}catch{return!1}}async function D(){try{if(await C())return!0;console.error("[claude-mem] Worker not responding, starting...");let o=L(),e=g.join(o,"plugin","scripts","worker-service.cjs");if(!b(e))return console.error(`[claude-mem] Worker service not found at ${e}`),!1;let s=g.join(o,"ecosystem.config.cjs"),t=g.join(o,"node_modules",".bin","pm2");if(!b(t))throw new Error(`PM2 binary not found at ${t}. This is a bundled dependency - try running: npm install`);if(!b(s))throw new Error(`PM2 ecosystem config not found at ${s}. 
Plugin installation may be corrupted.`);let r=B(t,["start",s],{detached:!0,stdio:"ignore",cwd:o});r.on("error",n=>{throw new Error(`Failed to spawn PM2: ${n.message}`)}),r.unref(),console.error("[claude-mem] Worker started with PM2");for(let n=0;n<3;n++)if(await new Promise(i=>setTimeout(i,500)),await C())return console.error("[claude-mem] Worker is healthy"),!0;return console.error("[claude-mem] Worker failed to become healthy after startup"),!1}catch(o){return console.error(`[claude-mem] Failed to start worker: ${o.message}`),!1}}function y(){return k}async function Y(o){if(!o)throw new Error("newHook requires input");let{session_id:e,cwd:s,prompt:t}=o,r=j.basename(s);if(!await D())throw new Error("Worker service failed to start or become healthy");let i=new m,a=i.createSDKSession(e,r,t),d=i.incrementPromptCounter(a);i.saveUserPrompt(e,d,t),console.error(`[new-hook] Session ${a}, prompt #${d}`),i.close();let E=y(),c=await fetch(`http://127.0.0.1:${E}/sessions/${a}/init`,{method:"POST",headers:{"Content-Type":"application/json"},body:JSON.stringify({project:r,userPrompt:t}),signal:AbortSignal.timeout(5e3)});if(!c.ok){let _=await c.text();throw new Error(`Failed to initialize session: ${c.status} ${_}`)}console.log(v("UserPromptSubmit",!0))}var N="";x.on("data",o=>N+=o);x.on("end",async()=>{let o=N?JSON.parse(N):void 0;await Y(o)});
`).run(e.toISOString(),s).changes}close(){this.db.close()}};function B(o,e,s){return o==="PreCompact"?e?{continue:!0,suppressOutput:!0}:{continue:!1,stopReason:s.reason||"Pre-compact operation failed",suppressOutput:!0}:o==="SessionStart"?e&&s.context?{continue:!0,suppressOutput:!0,hookSpecificOutput:{hookEventName:"SessionStart",additionalContext:s.context}}:{continue:!0,suppressOutput:!0}:o==="UserPromptSubmit"||o==="PostToolUse"?{continue:!0,suppressOutput:!0}:o==="Stop"?{continue:!0,suppressOutput:!0}:{continue:e,suppressOutput:!0,...s.reason&&!e?{stopReason:s.reason}:{}}}function v(o,e,s={}){let t=B(o,e,s);return JSON.stringify(t)}import g from"path";import{existsSync as b}from"fs";import{spawn as W}from"child_process";var k=parseInt(process.env.CLAUDE_MEM_WORKER_PORT||"37777",10),$=`http://127.0.0.1:${k}/health`;async function C(){try{return(await fetch($,{signal:AbortSignal.timeout(500)})).ok}catch{return!1}}async function D(){try{if(await C())return!0;console.error("[claude-mem] Worker not responding, starting...");let o=L(),e=g.join(o,"plugin","scripts","worker-service.cjs");if(!b(e))return console.error(`[claude-mem] Worker service not found at ${e}`),!1;let s=g.join(o,"ecosystem.config.cjs"),t=g.join(o,"node_modules",".bin","pm2");if(!b(t))throw new Error(`PM2 binary not found at ${t}. This is a bundled dependency - try running: npm install`);if(!b(s))throw new Error(`PM2 ecosystem config not found at ${s}. 
Plugin installation may be corrupted.`);let r=W(t,["start",s],{detached:!0,stdio:"ignore",cwd:o});r.on("error",n=>{throw new Error(`Failed to spawn PM2: ${n.message}`)}),r.unref(),console.error("[claude-mem] Worker started with PM2");for(let n=0;n<3;n++)if(await new Promise(i=>setTimeout(i,500)),await C())return console.error("[claude-mem] Worker is healthy"),!0;return console.error("[claude-mem] Worker failed to become healthy after startup"),!1}catch(o){return console.error(`[claude-mem] Failed to start worker: ${o.message}`),!1}}function y(){return k}async function Y(o){if(!o)throw new Error("newHook requires input");let{session_id:e,cwd:s,prompt:t}=o,r=j.basename(s);if(!await D())throw new Error("Worker service failed to start or become healthy");let i=new m,a=i.createSDKSession(e,r,t),d=i.incrementPromptCounter(a);i.saveUserPrompt(e,d,t),console.error(`[new-hook] Session ${a}, prompt #${d}`),i.close();let u=y(),c=await fetch(`http://127.0.0.1:${u}/sessions/${a}/init`,{method:"POST",headers:{"Content-Type":"application/json"},body:JSON.stringify({project:r,userPrompt:t}),signal:AbortSignal.timeout(5e3)});if(!c.ok){let _=await c.text();throw new Error(`Failed to initialize session: ${c.status} ${_}`)}console.log(v("UserPromptSubmit",!0))}var R="";x.on("data",o=>R+=o);x.on("end",async()=>{let o=R?JSON.parse(R):void 0;await Y(o)});
@@ -1,7 +1,7 @@
#!/usr/bin/env node
import{stdin as D}from"process";import F from"better-sqlite3";import{join as p,dirname as U,basename as Q}from"path";import{homedir as I}from"os";import{existsSync as se,mkdirSync as w}from"fs";import{fileURLToPath as M}from"url";function X(){return typeof __dirname<"u"?__dirname:U(M(import.meta.url))}var P=X(),u=process.env.CLAUDE_MEM_DATA_DIR||p(I(),".claude-mem"),S=process.env.CLAUDE_CONFIG_DIR||p(I(),".claude"),re=p(u,"archives"),oe=p(u,"logs"),ne=p(u,"trash"),ie=p(u,"backups"),ae=p(u,"settings.json"),L=p(u,"claude-mem.db"),de=p(S,"settings.json"),pe=p(S,"commands"),ce=p(S,"CLAUDE.md");function A(n){w(n,{recursive:!0})}function v(){return p(P,"..","..")}var g=(o=>(o[o.DEBUG=0]="DEBUG",o[o.INFO=1]="INFO",o[o.WARN=2]="WARN",o[o.ERROR=3]="ERROR",o[o.SILENT=4]="SILENT",o))(g||{}),b=class{level;useColor;constructor(){let e=process.env.CLAUDE_MEM_LOG_LEVEL?.toUpperCase()||"INFO";this.level=g[e]??1,this.useColor=process.stdout.isTTY??!1}correlationId(e,s){return`obs-${e}-${s}`}sessionId(e){return`session-${e}`}formatData(e){if(e==null)return"";if(typeof e=="string")return e;if(typeof e=="number"||typeof e=="boolean")return e.toString();if(typeof e=="object"){if(e instanceof Error)return this.level===0?`${e.message}
${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Object.keys(e);return s.length===0?"{}":s.length<=3?JSON.stringify(e):`{${s.length} keys: ${s.slice(0,3).join(", ")}...}`}return String(e)}formatTool(e,s){if(!s)return e;try{let t=typeof s=="string"?JSON.parse(s):s;if(e==="Bash"&&t.command){let r=t.command.length>50?t.command.substring(0,50)+"...":t.command;return`${e}(${r})`}if(e==="Read"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Edit"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Write"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}return e}catch{return e}}log(e,s,t,r,o){if(e<this.level)return;let i=new Date().toISOString().replace("T"," ").substring(0,23),a=g[e].padEnd(5),d=s.padEnd(6),c="";r?.correlationId?c=`[${r.correlationId}] `:r?.sessionId&&(c=`[session-${r.sessionId}] `);let E="";o!=null&&(this.level===0&&typeof o=="object"?E=`
`+JSON.stringify(o,null,2):E=" "+this.formatData(o));let _="";if(r){let{sessionId:K,sdkSessionId:Y,correlationId:V,...h}=r;Object.keys(h).length>0&&(_=` {${Object.entries(h).map(([y,x])=>`${y}=${x}`).join(", ")}}`)}let l=`[${i}] [${a}] [${d}] ${c}${t}${_}${E}`;e===3?console.error(l):console.log(l)}debug(e,s,t,r){this.log(0,e,s,t,r)}info(e,s,t,r){this.log(1,e,s,t,r)}warn(e,s,t,r){this.log(2,e,s,t,r)}error(e,s,t,r){this.log(3,e,s,t,r)}dataIn(e,s,t,r){this.info(e,`\u2192 ${s}`,t,r)}dataOut(e,s,t,r){this.info(e,`\u2190 ${s}`,t,r)}success(e,s,t,r){this.info(e,`\u2713 ${s}`,t,r)}failure(e,s,t,r){this.error(e,`\u2717 ${s}`,t,r)}timing(e,s,t,r){this.info(e,`\u23F1 ${s}`,r,{duration:`${t}ms`})}},m=new b;var T=class{db;constructor(){A(u),this.db=new F(L),this.db.pragma("journal_mode = WAL"),this.db.pragma("synchronous = NORMAL"),this.db.pragma("foreign_keys = ON"),this.initializeSchema(),this.ensureWorkerPortColumn(),this.ensurePromptTrackingColumns(),this.removeSessionSummariesUniqueConstraint(),this.addObservationHierarchicalFields(),this.makeObservationsTextNullable(),this.createUserPromptsTable()}initializeSchema(){try{this.db.exec(`
import{stdin as D}from"process";import F from"better-sqlite3";import{join as p,dirname as U,basename as Q}from"path";import{homedir as I}from"os";import{existsSync as se,mkdirSync as w}from"fs";import{fileURLToPath as M}from"url";function X(){return typeof __dirname<"u"?__dirname:U(M(import.meta.url))}var P=X(),c=process.env.CLAUDE_MEM_DATA_DIR||p(I(),".claude-mem"),S=process.env.CLAUDE_CONFIG_DIR||p(I(),".claude"),re=p(c,"archives"),oe=p(c,"logs"),ne=p(c,"trash"),ie=p(c,"backups"),ae=p(c,"settings.json"),L=p(c,"claude-mem.db"),de=p(c,"vector-db"),pe=p(S,"settings.json"),ce=p(S,"commands"),Ee=p(S,"CLAUDE.md");function A(n){w(n,{recursive:!0})}function v(){return p(P,"..","..")}var g=(o=>(o[o.DEBUG=0]="DEBUG",o[o.INFO=1]="INFO",o[o.WARN=2]="WARN",o[o.ERROR=3]="ERROR",o[o.SILENT=4]="SILENT",o))(g||{}),b=class{level;useColor;constructor(){let e=process.env.CLAUDE_MEM_LOG_LEVEL?.toUpperCase()||"INFO";this.level=g[e]??1,this.useColor=process.stdout.isTTY??!1}correlationId(e,s){return`obs-${e}-${s}`}sessionId(e){return`session-${e}`}formatData(e){if(e==null)return"";if(typeof e=="string")return e;if(typeof e=="number"||typeof e=="boolean")return e.toString();if(typeof e=="object"){if(e instanceof Error)return this.level===0?`${e.message}
${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Object.keys(e);return s.length===0?"{}":s.length<=3?JSON.stringify(e):`{${s.length} keys: ${s.slice(0,3).join(", ")}...}`}return String(e)}formatTool(e,s){if(!s)return e;try{let t=typeof s=="string"?JSON.parse(s):s;if(e==="Bash"&&t.command){let r=t.command.length>50?t.command.substring(0,50)+"...":t.command;return`${e}(${r})`}if(e==="Read"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Edit"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Write"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}return e}catch{return e}}log(e,s,t,r,o){if(e<this.level)return;let i=new Date().toISOString().replace("T"," ").substring(0,23),a=g[e].padEnd(5),d=s.padEnd(6),E="";r?.correlationId?E=`[${r.correlationId}] `:r?.sessionId&&(E=`[session-${r.sessionId}] `);let u="";o!=null&&(this.level===0&&typeof o=="object"?u=`
`+JSON.stringify(o,null,2):u=" "+this.formatData(o));let _="";if(r){let{sessionId:K,sdkSessionId:Y,correlationId:V,...h}=r;Object.keys(h).length>0&&(_=` {${Object.entries(h).map(([y,x])=>`${y}=${x}`).join(", ")}}`)}let l=`[${i}] [${a}] [${d}] ${E}${t}${_}${u}`;e===3?console.error(l):console.log(l)}debug(e,s,t,r){this.log(0,e,s,t,r)}info(e,s,t,r){this.log(1,e,s,t,r)}warn(e,s,t,r){this.log(2,e,s,t,r)}error(e,s,t,r){this.log(3,e,s,t,r)}dataIn(e,s,t,r){this.info(e,`\u2192 ${s}`,t,r)}dataOut(e,s,t,r){this.info(e,`\u2190 ${s}`,t,r)}success(e,s,t,r){this.info(e,`\u2713 ${s}`,t,r)}failure(e,s,t,r){this.error(e,`\u2717 ${s}`,t,r)}timing(e,s,t,r){this.info(e,`\u23F1 ${s}`,r,{duration:`${t}ms`})}},m=new b;var T=class{db;constructor(){A(c),this.db=new F(L),this.db.pragma("journal_mode = WAL"),this.db.pragma("synchronous = NORMAL"),this.db.pragma("foreign_keys = ON"),this.initializeSchema(),this.ensureWorkerPortColumn(),this.ensurePromptTrackingColumns(),this.removeSessionSummariesUniqueConstraint(),this.addObservationHierarchicalFields(),this.makeObservationsTextNullable(),this.createUserPromptsTable()}initializeSchema(){try{this.db.exec(`
CREATE TABLE IF NOT EXISTS schema_versions (
id INTEGER PRIMARY KEY,
version INTEGER UNIQUE NOT NULL,
@@ -210,7 +210,13 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
FROM observations
WHERE sdk_session_id = ?
ORDER BY created_at_epoch ASC
`).all(e)}getSummaryForSession(e){return this.db.prepare(`
`).all(e)}getObservationsByIds(e,s={}){if(e.length===0)return[];let{orderBy:t="date_desc",limit:r}=s,o=t==="date_asc"?"ASC":"DESC",i=r?`LIMIT ${r}`:"",a=e.map(()=>"?").join(",");return this.db.prepare(`
SELECT *
FROM observations
WHERE id IN (${a})
ORDER BY created_at_epoch ${o}
${i}
`).all(...e)}getSummaryForSession(e){return this.db.prepare(`
SELECT
request, investigated, learned, completed, next_steps,
files_read, files_edited, notes, prompt_number, created_at
@@ -306,4 +312,4 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
UPDATE sdk_sessions
SET status = 'failed', completed_at = ?, completed_at_epoch = ?
WHERE status = 'active'
`).run(e.toISOString(),s).changes}close(){this.db.close()}};function H(n,e,s){return n==="PreCompact"?e?{continue:!0,suppressOutput:!0}:{continue:!1,stopReason:s.reason||"Pre-compact operation failed",suppressOutput:!0}:n==="SessionStart"?e&&s.context?{continue:!0,suppressOutput:!0,hookSpecificOutput:{hookEventName:"SessionStart",additionalContext:s.context}}:{continue:!0,suppressOutput:!0}:n==="UserPromptSubmit"||n==="PostToolUse"?{continue:!0,suppressOutput:!0}:n==="Stop"?{continue:!0,suppressOutput:!0}:{continue:e,suppressOutput:!0,...s.reason&&!e?{stopReason:s.reason}:{}}}function N(n,e,s={}){let t=H(n,e,s);return JSON.stringify(t)}import R from"path";import{existsSync as f}from"fs";import{spawn as G}from"child_process";var W=parseInt(process.env.CLAUDE_MEM_WORKER_PORT||"37777",10),B=`http://127.0.0.1:${W}/health`;async function C(){try{return(await fetch(B,{signal:AbortSignal.timeout(500)})).ok}catch{return!1}}async function k(){try{if(await C())return!0;console.error("[claude-mem] Worker not responding, starting...");let n=v(),e=R.join(n,"plugin","scripts","worker-service.cjs");if(!f(e))return console.error(`[claude-mem] Worker service not found at ${e}`),!1;let s=R.join(n,"ecosystem.config.cjs"),t=R.join(n,"node_modules",".bin","pm2");if(!f(t))throw new Error(`PM2 binary not found at ${t}. This is a bundled dependency - try running: npm install`);if(!f(s))throw new Error(`PM2 ecosystem config not found at ${s}. 
Plugin installation may be corrupted.`);let r=G(t,["start",s],{detached:!0,stdio:"ignore",cwd:n});r.on("error",o=>{throw new Error(`Failed to spawn PM2: ${o.message}`)}),r.unref(),console.error("[claude-mem] Worker started with PM2");for(let o=0;o<3;o++)if(await new Promise(i=>setTimeout(i,500)),await C())return console.error("[claude-mem] Worker is healthy"),!0;return console.error("[claude-mem] Worker failed to become healthy after startup"),!1}catch(n){return console.error(`[claude-mem] Failed to start worker: ${n.message}`),!1}}var $=new Set(["ListMcpResourcesTool"]);async function j(n){if(!n)throw new Error("saveHook requires input");let{session_id:e,tool_name:s,tool_input:t,tool_output:r}=n;if($.has(s)){console.log(N("PostToolUse",!0));return}if(!await k())throw new Error("Worker service failed to start or become healthy");let i=new T,a=i.createSDKSession(e,"",""),d=i.getPromptCounter(a);i.close();let c=m.formatTool(s,t),E=parseInt(process.env.CLAUDE_MEM_WORKER_PORT||"37777",10);m.dataIn("HOOK",`PostToolUse: ${c}`,{sessionId:a,workerPort:E});let _=await fetch(`http://127.0.0.1:${E}/sessions/${a}/observations`,{method:"POST",headers:{"Content-Type":"application/json"},body:JSON.stringify({tool_name:s,tool_input:t!==void 0?JSON.stringify(t):"{}",tool_output:r!==void 0?JSON.stringify(r):"{}",prompt_number:d}),signal:AbortSignal.timeout(2e3)});if(!_.ok){let l=await _.text();throw m.failure("HOOK","Failed to send observation",{sessionId:a,status:_.status},l),new Error(`Failed to send observation to worker: ${_.status} ${l}`)}m.debug("HOOK","Observation sent successfully",{sessionId:a,toolName:s}),console.log(N("PostToolUse",!0))}var O="";D.on("data",n=>O+=n);D.on("end",async()=>{let n=O?JSON.parse(O):void 0;await j(n)});
`).run(e.toISOString(),s).changes}close(){this.db.close()}};function H(n,e,s){return n==="PreCompact"?e?{continue:!0,suppressOutput:!0}:{continue:!1,stopReason:s.reason||"Pre-compact operation failed",suppressOutput:!0}:n==="SessionStart"?e&&s.context?{continue:!0,suppressOutput:!0,hookSpecificOutput:{hookEventName:"SessionStart",additionalContext:s.context}}:{continue:!0,suppressOutput:!0}:n==="UserPromptSubmit"||n==="PostToolUse"?{continue:!0,suppressOutput:!0}:n==="Stop"?{continue:!0,suppressOutput:!0}:{continue:e,suppressOutput:!0,...s.reason&&!e?{stopReason:s.reason}:{}}}function R(n,e,s={}){let t=H(n,e,s);return JSON.stringify(t)}import N from"path";import{existsSync as f}from"fs";import{spawn as G}from"child_process";var B=parseInt(process.env.CLAUDE_MEM_WORKER_PORT||"37777",10),W=`http://127.0.0.1:${B}/health`;async function C(){try{return(await fetch(W,{signal:AbortSignal.timeout(500)})).ok}catch{return!1}}async function k(){try{if(await C())return!0;console.error("[claude-mem] Worker not responding, starting...");let n=v(),e=N.join(n,"plugin","scripts","worker-service.cjs");if(!f(e))return console.error(`[claude-mem] Worker service not found at ${e}`),!1;let s=N.join(n,"ecosystem.config.cjs"),t=N.join(n,"node_modules",".bin","pm2");if(!f(t))throw new Error(`PM2 binary not found at ${t}. This is a bundled dependency - try running: npm install`);if(!f(s))throw new Error(`PM2 ecosystem config not found at ${s}. 
Plugin installation may be corrupted.`);let r=G(t,["start",s],{detached:!0,stdio:"ignore",cwd:n});r.on("error",o=>{throw new Error(`Failed to spawn PM2: ${o.message}`)}),r.unref(),console.error("[claude-mem] Worker started with PM2");for(let o=0;o<3;o++)if(await new Promise(i=>setTimeout(i,500)),await C())return console.error("[claude-mem] Worker is healthy"),!0;return console.error("[claude-mem] Worker failed to become healthy after startup"),!1}catch(n){return console.error(`[claude-mem] Failed to start worker: ${n.message}`),!1}}var $=new Set(["ListMcpResourcesTool"]);async function j(n){if(!n)throw new Error("saveHook requires input");let{session_id:e,tool_name:s,tool_input:t,tool_output:r}=n;if($.has(s)){console.log(R("PostToolUse",!0));return}if(!await k())throw new Error("Worker service failed to start or become healthy");let i=new T,a=i.createSDKSession(e,"",""),d=i.getPromptCounter(a);i.close();let E=m.formatTool(s,t),u=parseInt(process.env.CLAUDE_MEM_WORKER_PORT||"37777",10);m.dataIn("HOOK",`PostToolUse: ${E}`,{sessionId:a,workerPort:u});let _=await fetch(`http://127.0.0.1:${u}/sessions/${a}/observations`,{method:"POST",headers:{"Content-Type":"application/json"},body:JSON.stringify({tool_name:s,tool_input:t!==void 0?JSON.stringify(t):"{}",tool_output:r!==void 0?JSON.stringify(r):"{}",prompt_number:d}),signal:AbortSignal.timeout(2e3)});if(!_.ok){let l=await _.text();throw m.failure("HOOK","Failed to send observation",{sessionId:a,status:_.status},l),new Error(`Failed to send observation to worker: ${_.status} ${l}`)}m.debug("HOOK","Observation sent successfully",{sessionId:a,toolName:s}),console.log(R("PostToolUse",!0))}var O="";D.on("data",n=>O+=n);D.on("end",async()=>{let n=O?JSON.parse(O):void 0;await j(n)});
File diff suppressed because one or more lines are too long
@@ -1,7 +1,7 @@
#!/usr/bin/env node
import{stdin as D}from"process";import F from"better-sqlite3";import{join as p,dirname as U,basename as J}from"path";import{homedir as O}from"os";import{existsSync as ee,mkdirSync as w}from"fs";import{fileURLToPath as X}from"url";function M(){return typeof __dirname<"u"?__dirname:U(X(import.meta.url))}var P=M(),c=process.env.CLAUDE_MEM_DATA_DIR||p(O(),".claude-mem"),l=process.env.CLAUDE_CONFIG_DIR||p(O(),".claude"),te=p(c,"archives"),re=p(c,"logs"),ne=p(c,"trash"),oe=p(c,"backups"),ie=p(c,"settings.json"),I=p(c,"claude-mem.db"),ae=p(l,"settings.json"),de=p(l,"commands"),pe=p(l,"CLAUDE.md");function L(o){w(o,{recursive:!0})}function A(){return p(P,"..","..")}var T=(n=>(n[n.DEBUG=0]="DEBUG",n[n.INFO=1]="INFO",n[n.WARN=2]="WARN",n[n.ERROR=3]="ERROR",n[n.SILENT=4]="SILENT",n))(T||{}),S=class{level;useColor;constructor(){let e=process.env.CLAUDE_MEM_LOG_LEVEL?.toUpperCase()||"INFO";this.level=T[e]??1,this.useColor=process.stdout.isTTY??!1}correlationId(e,s){return`obs-${e}-${s}`}sessionId(e){return`session-${e}`}formatData(e){if(e==null)return"";if(typeof e=="string")return e;if(typeof e=="number"||typeof e=="boolean")return e.toString();if(typeof e=="object"){if(e instanceof Error)return this.level===0?`${e.message}
import{stdin as D}from"process";import P from"better-sqlite3";import{join as p,dirname as U,basename as J}from"path";import{homedir as O}from"os";import{existsSync as ee,mkdirSync as w}from"fs";import{fileURLToPath as M}from"url";function X(){return typeof __dirname<"u"?__dirname:U(M(import.meta.url))}var F=X(),c=process.env.CLAUDE_MEM_DATA_DIR||p(O(),".claude-mem"),l=process.env.CLAUDE_CONFIG_DIR||p(O(),".claude"),te=p(c,"archives"),re=p(c,"logs"),ne=p(c,"trash"),oe=p(c,"backups"),ie=p(c,"settings.json"),I=p(c,"claude-mem.db"),ae=p(c,"vector-db"),de=p(l,"settings.json"),pe=p(l,"commands"),ce=p(l,"CLAUDE.md");function L(o){w(o,{recursive:!0})}function A(){return p(F,"..","..")}var T=(n=>(n[n.DEBUG=0]="DEBUG",n[n.INFO=1]="INFO",n[n.WARN=2]="WARN",n[n.ERROR=3]="ERROR",n[n.SILENT=4]="SILENT",n))(T||{}),S=class{level;useColor;constructor(){let e=process.env.CLAUDE_MEM_LOG_LEVEL?.toUpperCase()||"INFO";this.level=T[e]??1,this.useColor=process.stdout.isTTY??!1}correlationId(e,s){return`obs-${e}-${s}`}sessionId(e){return`session-${e}`}formatData(e){if(e==null)return"";if(typeof e=="string")return e;if(typeof e=="number"||typeof e=="boolean")return e.toString();if(typeof e=="object"){if(e instanceof Error)return this.level===0?`${e.message}
${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Object.keys(e);return s.length===0?"{}":s.length<=3?JSON.stringify(e):`{${s.length} keys: ${s.slice(0,3).join(", ")}...}`}return String(e)}formatTool(e,s){if(!s)return e;try{let t=typeof s=="string"?JSON.parse(s):s;if(e==="Bash"&&t.command){let r=t.command.length>50?t.command.substring(0,50)+"...":t.command;return`${e}(${r})`}if(e==="Read"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Edit"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}if(e==="Write"&&t.file_path){let r=t.file_path.split("/").pop()||t.file_path;return`${e}(${r})`}return e}catch{return e}}log(e,s,t,r,n){if(e<this.level)return;let i=new Date().toISOString().replace("T"," ").substring(0,23),a=T[e].padEnd(5),d=s.padEnd(6),E="";r?.correlationId?E=`[${r.correlationId}] `:r?.sessionId&&(E=`[session-${r.sessionId}] `);let _="";n!=null&&(this.level===0&&typeof n=="object"?_=`
`+JSON.stringify(n,null,2):_=" "+this.formatData(n));let b="";if(r){let{sessionId:j,sdkSessionId:K,correlationId:Y,...h}=r;Object.keys(h).length>0&&(b=` {${Object.entries(h).map(([y,x])=>`${y}=${x}`).join(", ")}}`)}let f=`[${i}] [${a}] [${d}] ${E}${t}${b}${_}`;e===3?console.error(f):console.log(f)}debug(e,s,t,r){this.log(0,e,s,t,r)}info(e,s,t,r){this.log(1,e,s,t,r)}warn(e,s,t,r){this.log(2,e,s,t,r)}error(e,s,t,r){this.log(3,e,s,t,r)}dataIn(e,s,t,r){this.info(e,`\u2192 ${s}`,t,r)}dataOut(e,s,t,r){this.info(e,`\u2190 ${s}`,t,r)}success(e,s,t,r){this.info(e,`\u2713 ${s}`,t,r)}failure(e,s,t,r){this.error(e,`\u2717 ${s}`,t,r)}timing(e,s,t,r){this.info(e,`\u23F1 ${s}`,r,{duration:`${t}ms`})}},u=new S;var m=class{db;constructor(){L(c),this.db=new F(I),this.db.pragma("journal_mode = WAL"),this.db.pragma("synchronous = NORMAL"),this.db.pragma("foreign_keys = ON"),this.initializeSchema(),this.ensureWorkerPortColumn(),this.ensurePromptTrackingColumns(),this.removeSessionSummariesUniqueConstraint(),this.addObservationHierarchicalFields(),this.makeObservationsTextNullable(),this.createUserPromptsTable()}initializeSchema(){try{this.db.exec(`
`+JSON.stringify(n,null,2):_=" "+this.formatData(n));let N="";if(r){let{sessionId:j,sdkSessionId:K,correlationId:Y,...h}=r;Object.keys(h).length>0&&(N=` {${Object.entries(h).map(([y,x])=>`${y}=${x}`).join(", ")}}`)}let f=`[${i}] [${a}] [${d}] ${E}${t}${N}${_}`;e===3?console.error(f):console.log(f)}debug(e,s,t,r){this.log(0,e,s,t,r)}info(e,s,t,r){this.log(1,e,s,t,r)}warn(e,s,t,r){this.log(2,e,s,t,r)}error(e,s,t,r){this.log(3,e,s,t,r)}dataIn(e,s,t,r){this.info(e,`\u2192 ${s}`,t,r)}dataOut(e,s,t,r){this.info(e,`\u2190 ${s}`,t,r)}success(e,s,t,r){this.info(e,`\u2713 ${s}`,t,r)}failure(e,s,t,r){this.error(e,`\u2717 ${s}`,t,r)}timing(e,s,t,r){this.info(e,`\u23F1 ${s}`,r,{duration:`${t}ms`})}},u=new S;var m=class{db;constructor(){L(c),this.db=new P(I),this.db.pragma("journal_mode = WAL"),this.db.pragma("synchronous = NORMAL"),this.db.pragma("foreign_keys = ON"),this.initializeSchema(),this.ensureWorkerPortColumn(),this.ensurePromptTrackingColumns(),this.removeSessionSummariesUniqueConstraint(),this.addObservationHierarchicalFields(),this.makeObservationsTextNullable(),this.createUserPromptsTable()}initializeSchema(){try{this.db.exec(`
CREATE TABLE IF NOT EXISTS schema_versions (
id INTEGER PRIMARY KEY,
version INTEGER UNIQUE NOT NULL,
@@ -210,7 +210,13 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
FROM observations
WHERE sdk_session_id = ?
ORDER BY created_at_epoch ASC
`).all(e)}getSummaryForSession(e){return this.db.prepare(`
`).all(e)}getObservationsByIds(e,s={}){if(e.length===0)return[];let{orderBy:t="date_desc",limit:r}=s,n=t==="date_asc"?"ASC":"DESC",i=r?`LIMIT ${r}`:"",a=e.map(()=>"?").join(",");return this.db.prepare(`
SELECT *
FROM observations
WHERE id IN (${a})
ORDER BY created_at_epoch ${n}
${i}
`).all(...e)}getSummaryForSession(e){return this.db.prepare(`
SELECT
request, investigated, learned, completed, next_steps,
files_read, files_edited, notes, prompt_number, created_at
@@ -306,4 +312,4 @@ ${e.stack}`:e.message;if(Array.isArray(e))return`[${e.length} items]`;let s=Obje
UPDATE sdk_sessions
SET status = 'failed', completed_at = ?, completed_at_epoch = ?
WHERE status = 'active'
`).run(e.toISOString(),s).changes}close(){this.db.close()}};function H(o,e,s){return o==="PreCompact"?e?{continue:!0,suppressOutput:!0}:{continue:!1,stopReason:s.reason||"Pre-compact operation failed",suppressOutput:!0}:o==="SessionStart"?e&&s.context?{continue:!0,suppressOutput:!0,hookSpecificOutput:{hookEventName:"SessionStart",additionalContext:s.context}}:{continue:!0,suppressOutput:!0}:o==="UserPromptSubmit"||o==="PostToolUse"?{continue:!0,suppressOutput:!0}:o==="Stop"?{continue:!0,suppressOutput:!0}:{continue:e,suppressOutput:!0,...s.reason&&!e?{stopReason:s.reason}:{}}}function v(o,e,s={}){let t=H(o,e,s);return JSON.stringify(t)}import g from"path";import{existsSync as R}from"fs";import{spawn as G}from"child_process";var W=parseInt(process.env.CLAUDE_MEM_WORKER_PORT||"37777",10),B=`http://127.0.0.1:${W}/health`;async function C(){try{return(await fetch(B,{signal:AbortSignal.timeout(500)})).ok}catch{return!1}}async function k(){try{if(await C())return!0;console.error("[claude-mem] Worker not responding, starting...");let o=A(),e=g.join(o,"plugin","scripts","worker-service.cjs");if(!R(e))return console.error(`[claude-mem] Worker service not found at ${e}`),!1;let s=g.join(o,"ecosystem.config.cjs"),t=g.join(o,"node_modules",".bin","pm2");if(!R(t))throw new Error(`PM2 binary not found at ${t}. This is a bundled dependency - try running: npm install`);if(!R(s))throw new Error(`PM2 ecosystem config not found at ${s}. 
Plugin installation may be corrupted.`);let r=G(t,["start",s],{detached:!0,stdio:"ignore",cwd:o});r.on("error",n=>{throw new Error(`Failed to spawn PM2: ${n.message}`)}),r.unref(),console.error("[claude-mem] Worker started with PM2");for(let n=0;n<3;n++)if(await new Promise(i=>setTimeout(i,500)),await C())return console.error("[claude-mem] Worker is healthy"),!0;return console.error("[claude-mem] Worker failed to become healthy after startup"),!1}catch(o){return console.error(`[claude-mem] Failed to start worker: ${o.message}`),!1}}async function $(o){if(!o)throw new Error("summaryHook requires input");let{session_id:e}=o;if(!await k())throw new Error("Worker service failed to start or become healthy");let t=new m,r=t.createSDKSession(e,"",""),n=t.getPromptCounter(r);t.close();let i=parseInt(process.env.CLAUDE_MEM_WORKER_PORT||"37777",10);u.dataIn("HOOK","Stop: Requesting summary",{sessionId:r,workerPort:i,promptNumber:n});let a=await fetch(`http://127.0.0.1:${i}/sessions/${r}/summarize`,{method:"POST",headers:{"Content-Type":"application/json"},body:JSON.stringify({prompt_number:n}),signal:AbortSignal.timeout(2e3)});if(!a.ok){let d=await a.text();throw u.failure("HOOK","Failed to generate summary",{sessionId:r,status:a.status},d),new Error(`Failed to request summary from worker: ${a.status} ${d}`)}u.debug("HOOK","Summary request sent successfully",{sessionId:r}),console.log(v("Stop",!0))}var N="";D.on("data",o=>N+=o);D.on("end",async()=>{let o=N?JSON.parse(N):void 0;await $(o)});
`).run(e.toISOString(),s).changes}close(){this.db.close()}};function H(o,e,s){return o==="PreCompact"?e?{continue:!0,suppressOutput:!0}:{continue:!1,stopReason:s.reason||"Pre-compact operation failed",suppressOutput:!0}:o==="SessionStart"?e&&s.context?{continue:!0,suppressOutput:!0,hookSpecificOutput:{hookEventName:"SessionStart",additionalContext:s.context}}:{continue:!0,suppressOutput:!0}:o==="UserPromptSubmit"||o==="PostToolUse"?{continue:!0,suppressOutput:!0}:o==="Stop"?{continue:!0,suppressOutput:!0}:{continue:e,suppressOutput:!0,...s.reason&&!e?{stopReason:s.reason}:{}}}function v(o,e,s={}){let t=H(o,e,s);return JSON.stringify(t)}import g from"path";import{existsSync as R}from"fs";import{spawn as G}from"child_process";var B=parseInt(process.env.CLAUDE_MEM_WORKER_PORT||"37777",10),W=`http://127.0.0.1:${B}/health`;async function C(){try{return(await fetch(W,{signal:AbortSignal.timeout(500)})).ok}catch{return!1}}async function k(){try{if(await C())return!0;console.error("[claude-mem] Worker not responding, starting...");let o=A(),e=g.join(o,"plugin","scripts","worker-service.cjs");if(!R(e))return console.error(`[claude-mem] Worker service not found at ${e}`),!1;let s=g.join(o,"ecosystem.config.cjs"),t=g.join(o,"node_modules",".bin","pm2");if(!R(t))throw new Error(`PM2 binary not found at ${t}. This is a bundled dependency - try running: npm install`);if(!R(s))throw new Error(`PM2 ecosystem config not found at ${s}. 
Plugin installation may be corrupted.`);let r=G(t,["start",s],{detached:!0,stdio:"ignore",cwd:o});r.on("error",n=>{throw new Error(`Failed to spawn PM2: ${n.message}`)}),r.unref(),console.error("[claude-mem] Worker started with PM2");for(let n=0;n<3;n++)if(await new Promise(i=>setTimeout(i,500)),await C())return console.error("[claude-mem] Worker is healthy"),!0;return console.error("[claude-mem] Worker failed to become healthy after startup"),!1}catch(o){return console.error(`[claude-mem] Failed to start worker: ${o.message}`),!1}}async function $(o){if(!o)throw new Error("summaryHook requires input");let{session_id:e}=o;if(!await k())throw new Error("Worker service failed to start or become healthy");let t=new m,r=t.createSDKSession(e,"",""),n=t.getPromptCounter(r);t.close();let i=parseInt(process.env.CLAUDE_MEM_WORKER_PORT||"37777",10);u.dataIn("HOOK","Stop: Requesting summary",{sessionId:r,workerPort:i,promptNumber:n});let a=await fetch(`http://127.0.0.1:${i}/sessions/${r}/summarize`,{method:"POST",headers:{"Content-Type":"application/json"},body:JSON.stringify({prompt_number:n}),signal:AbortSignal.timeout(2e3)});if(!a.ok){let d=await a.text();throw u.failure("HOOK","Failed to generate summary",{sessionId:r,status:a.status},d),new Error(`Failed to request summary from worker: ${a.status} ${d}`)}u.debug("HOOK","Summary request sent successfully",{sessionId:r}),console.log(v("Stop",!0))}var b="";D.on("data",o=>b+=o);D.on("end",async()=>{let o=b?JSON.parse(b):void 0;await $(o)});
File diff suppressed because one or more lines are too long
+305
-14
@@ -5,6 +5,8 @@

import { Server } from '@modelcontextprotocol/sdk/server/index.js';
import { StdioServerTransport } from '@modelcontextprotocol/sdk/server/stdio.js';
import { Client } from '@modelcontextprotocol/sdk/client/index.js';
import { StdioClientTransport } from '@modelcontextprotocol/sdk/client/stdio.js';
import {
CallToolRequestSchema,
ListToolsRequestSchema,
@@ -15,10 +17,14 @@ import { basename } from 'path';
import { SessionSearch } from '../services/sqlite/SessionSearch.js';
import { SessionStore } from '../services/sqlite/SessionStore.js';
import { ObservationSearchResult, SessionSummarySearchResult, UserPromptSearchResult } from '../services/sqlite/types.js';
import { VECTOR_DB_DIR } from '../shared/paths.js';

// Initialize search instance
// Initialize search instances
let search: SessionSearch;
let store: SessionStore;
let chromaClient: Client | null = null;
const COLLECTION_NAME = 'cm__claude-mem';

try {
search = new SessionSearch();
store = new SessionStore();
@@ -27,6 +33,95 @@ try {
process.exit(1);
}

/**
* Query Chroma vector database via MCP
* Parses Python dict-like responses from Chroma MCP server
*/
async function queryChroma(
query: string,
limit: number,
whereFilter?: Record<string, any>
): Promise<{ ids: number[]; distances: number[]; metadatas: any[] }> {
if (!chromaClient) {
throw new Error('Chroma client not initialized');
}

const result = await chromaClient.callTool({
name: 'chroma_query_documents',
arguments: {
collection_name: COLLECTION_NAME,
query_texts: [query],
n_results: limit,
include: ['documents', 'metadatas', 'distances'],
where: whereFilter
}
});

const resultText = result.content[0]?.text || '';

// Parse Python dict-like output using regex
// Format: {'ids': [[...]], 'distances': [[...]], 'metadatas': [[...]]}

// Extract IDs (nested array format)
const idsMatch = resultText.match(/'ids':\s*\[\[(.*?)\]\]/s);
const ids: number[] = [];
if (idsMatch) {
const idsContent = idsMatch[1];
// Match quoted strings (Chroma doc IDs like 'obs_123_title')
const idMatches = idsContent.match(/'([^']*(?:\\'[^']*)*)'/g) || [];
for (const idMatch of idMatches) {
const docId = idMatch.slice(1, -1);
// Extract sqlite_id from document ID (format: obs_{id}_title)
const sqliteIdMatch = docId.match(/obs_(\d+)_/);
if (sqliteIdMatch) {
const sqliteId = parseInt(sqliteIdMatch[1], 10);
if (!ids.includes(sqliteId)) {
ids.push(sqliteId);
}
}
}
}

// Extract distances (nested array format)
const distancesMatch = resultText.match(/'distances':\s*\[\[([\d.,\s]+)\]\]/s);
const distances: number[] = [];
if (distancesMatch) {
const distancesContent = distancesMatch[1];
const distanceValues = distancesContent.split(',').map(d => parseFloat(d.trim())).filter(d => !isNaN(d));
distances.push(...distanceValues);
}

// Extract metadatas (nested array format)
const metasMatch = resultText.match(/'metadatas':\s*\[\[(.*?)\]\]/s);
const metadatas: any[] = [];
if (metasMatch) {
const metasContent = metasMatch[1];
// Parse each metadata dict
const metaObjMatches = metasContent.match(/\{[^}]+\}/g) || [];
for (const metaStr of metaObjMatches) {
const meta: any = {};
// Extract sqlite_id
const sqliteIdMatch = metaStr.match(/'sqlite_id':\s*(\d+)/);
if (sqliteIdMatch) {
meta.sqlite_id = parseInt(sqliteIdMatch[1], 10);
}
// Extract type
const typeMatch = metaStr.match(/'type':\s*'([^']+)'/);
if (typeMatch) {
meta.type = typeMatch[1];
}
// Extract created_at_epoch
const epochMatch = metaStr.match(/'created_at_epoch':\s*(\d+)/);
if (epochMatch) {
meta.created_at_epoch = parseInt(epochMatch[1], 10);
}
metadatas.push(meta);
}
}

return { ids, distances, metadatas };
}

/**
* Format search tips footer
*/
@@ -286,7 +381,45 @@ const tools = [
handler: async (args: any) => {
try {
const { query, format = 'index', ...options } = args;
const results = search.searchObservations(query, options);
let results: ObservationSearchResult[] = [];

// Hybrid search: Try Chroma semantic search first, fall back to FTS5
if (chromaClient) {
try {
console.error('[search-server] Using hybrid semantic search (Chroma + SQLite)');

// Step 1: Chroma semantic search (top 100)
const chromaResults = await queryChroma(query, 100);
console.error(`[search-server] Chroma returned ${chromaResults.ids.length} semantic matches`);

if (chromaResults.ids.length > 0) {
// Step 2: Filter by recency (90 days)
const ninetyDaysAgo = Math.floor(Date.now() / 1000) - (90 * 24 * 60 * 60);
const recentIds = chromaResults.ids.filter((id, idx) => {
const meta = chromaResults.metadatas[idx];
return meta && meta.created_at_epoch > ninetyDaysAgo;
});

console.error(`[search-server] ${recentIds.length} results within 90-day window`);

// Step 3: Hydrate from SQLite in temporal order
if (recentIds.length > 0) {
const limit = options.limit || 20;
results = store.getObservationsByIds(recentIds, { orderBy: 'date_desc', limit });
console.error(`[search-server] Hydrated ${results.length} observations from SQLite`);
}
}
} catch (chromaError: any) {
console.error('[search-server] Chroma query failed, falling back to FTS5:', chromaError.message);
// Fall through to FTS5 fallback
}
}

// Fall back to FTS5 if Chroma unavailable or returned no results
if (results.length === 0) {
console.error('[search-server] Using FTS5 keyword search');
results = search.searchObservations(query, options);
}

if (results.length === 0) {
return {
@@ -400,7 +533,50 @@ const tools = [
handler: async (args: any) => {
try {
const { concept, format = 'index', ...filters } = args;
const results = search.findByConcept(concept, filters);
let results: ObservationSearchResult[] = [];

// Metadata-first, semantic-enhanced search
if (chromaClient) {
try {
console.error('[search-server] Using metadata-first + semantic ranking for concept search');

// Step 1: SQLite metadata filter (get all IDs with this concept)
const metadataResults = search.findByConcept(concept, filters);
console.error(`[search-server] Found ${metadataResults.length} observations with concept "${concept}"`);

if (metadataResults.length > 0) {
// Step 2: Chroma semantic ranking (rank by relevance to concept)
const ids = metadataResults.map(obs => obs.id);
const chromaResults = await queryChroma(concept, Math.min(ids.length, 100));

// Intersect: Keep only IDs that passed metadata filter, in semantic rank order
const rankedIds: number[] = [];
for (const chromaId of chromaResults.ids) {
if (ids.includes(chromaId) && !rankedIds.includes(chromaId)) {
rankedIds.push(chromaId);
}
}

console.error(`[search-server] Chroma ranked ${rankedIds.length} results by semantic relevance`);

// Step 3: Hydrate in semantic rank order
if (rankedIds.length > 0) {
results = store.getObservationsByIds(rankedIds, { limit: filters.limit || 20 });
// Restore semantic ranking order
results.sort((a, b) => rankedIds.indexOf(a.id) - rankedIds.indexOf(b.id));
}
}
} catch (chromaError: any) {
console.error('[search-server] Chroma ranking failed, using SQLite order:', chromaError.message);
// Fall through to SQLite fallback
}
}

// Fall back to SQLite-only if Chroma unavailable or failed
if (results.length === 0) {
console.error('[search-server] Using SQLite-only concept search');
results = search.findByConcept(concept, filters);
}

if (results.length === 0) {
return {
@@ -457,9 +633,59 @@ const tools = [
handler: async (args: any) => {
try {
const { filePath, format = 'index', ...filters } = args;
const results = search.findByFile(filePath, filters);
let observations: ObservationSearchResult[] = [];
let sessions: SessionSummarySearchResult[] = [];

const totalResults = results.observations.length + results.sessions.length;
// Metadata-first, semantic-enhanced search for observations
if (chromaClient) {
try {
console.error('[search-server] Using metadata-first + semantic ranking for file search');

// Step 1: SQLite metadata filter (get all results with this file)
const metadataResults = search.findByFile(filePath, filters);
console.error(`[search-server] Found ${metadataResults.observations.length} observations, ${metadataResults.sessions.length} sessions for file "${filePath}"`);

// Sessions: Keep as-is (already summarized, no semantic ranking needed)
sessions = metadataResults.sessions;

// Observations: Apply semantic ranking
if (metadataResults.observations.length > 0) {
// Step 2: Chroma semantic ranking (rank by relevance to file path)
const ids = metadataResults.observations.map(obs => obs.id);
const chromaResults = await queryChroma(filePath, Math.min(ids.length, 100));

// Intersect: Keep only IDs that passed metadata filter, in semantic rank order
const rankedIds: number[] = [];
for (const chromaId of chromaResults.ids) {
if (ids.includes(chromaId) && !rankedIds.includes(chromaId)) {
rankedIds.push(chromaId);
}
}

console.error(`[search-server] Chroma ranked ${rankedIds.length} observations by semantic relevance`);

// Step 3: Hydrate in semantic rank order
if (rankedIds.length > 0) {
observations = store.getObservationsByIds(rankedIds, { limit: filters.limit || 20 });
// Restore semantic ranking order
observations.sort((a, b) => rankedIds.indexOf(a.id) - rankedIds.indexOf(b.id));
}
}
} catch (chromaError: any) {
console.error('[search-server] Chroma ranking failed, using SQLite order:', chromaError.message);
// Fall through to SQLite fallback
}
}

// Fall back to SQLite-only if Chroma unavailable or failed
if (observations.length === 0 && sessions.length === 0) {
console.error('[search-server] Using SQLite-only file search');
const results = search.findByFile(filePath, filters);
observations = results.observations;
sessions = results.sessions;
}

const totalResults = observations.length + sessions.length;

if (totalResults === 0) {
return {
@@ -476,13 +702,13 @@ const tools = [
const formattedResults: string[] = [];

// Add observations
results.observations.forEach((obs, i) => {
observations.forEach((obs, i) => {
formattedResults.push(formatObservationIndex(obs, i));
});

// Add sessions
results.sessions.forEach((session, i) => {
formattedResults.push(formatSessionIndex(session, i + results.observations.length));
sessions.forEach((session, i) => {
formattedResults.push(formatSessionIndex(session, i + observations.length));
});

combinedText = header + formattedResults.join('\n\n') + formatSearchTips();
@@ -490,13 +716,13 @@ const tools = [
const formattedResults: string[] = [];

// Add observations
results.observations.forEach((obs, i) => {
observations.forEach((obs, i) => {
formattedResults.push(formatObservationResult(obs, i));
});

// Add sessions
results.sessions.forEach((session, i) => {
formattedResults.push(formatSessionResult(session, i + results.observations.length));
sessions.forEach((session, i) => {
formattedResults.push(formatSessionResult(session, i + observations.length));
});

combinedText = formattedResults.join('\n\n---\n\n');
@@ -540,10 +766,53 @@ const tools = [
handler: async (args: any) => {
try {
const { type, format = 'index', ...filters } = args;
const results = search.findByType(type, filters);
const typeStr = Array.isArray(type) ? type.join(', ') : type;
let results: ObservationSearchResult[] = [];

// Metadata-first, semantic-enhanced search
if (chromaClient) {
try {
console.error('[search-server] Using metadata-first + semantic ranking for type search');

// Step 1: SQLite metadata filter (get all IDs with this type)
const metadataResults = search.findByType(type, filters);
console.error(`[search-server] Found ${metadataResults.length} observations with type "${typeStr}"`);

if (metadataResults.length > 0) {
// Step 2: Chroma semantic ranking (rank by relevance to type)
const ids = metadataResults.map(obs => obs.id);
const chromaResults = await queryChroma(typeStr, Math.min(ids.length, 100));

// Intersect: Keep only IDs that passed metadata filter, in semantic rank order
const rankedIds: number[] = [];
for (const chromaId of chromaResults.ids) {
if (ids.includes(chromaId) && !rankedIds.includes(chromaId)) {
rankedIds.push(chromaId);
}
}

console.error(`[search-server] Chroma ranked ${rankedIds.length} results by semantic relevance`);

// Step 3: Hydrate in semantic rank order
if (rankedIds.length > 0) {
results = store.getObservationsByIds(rankedIds, { limit: filters.limit || 20 });
// Restore semantic ranking order
results.sort((a, b) => rankedIds.indexOf(a.id) - rankedIds.indexOf(b.id));
}
}
} catch (chromaError: any) {
console.error('[search-server] Chroma ranking failed, using SQLite order:', chromaError.message);
// Fall through to SQLite fallback
}
}

// Fall back to SQLite-only if Chroma unavailable or failed
if (results.length === 0) {
console.error('[search-server] Using SQLite-only type search');
results = search.findByType(type, filters);
}

if (results.length === 0) {
const typeStr = Array.isArray(type) ? type.join(', ') : type;
return {
content: [{
type: 'text' as const,
@@ -553,7 +822,6 @@ const tools = [
}

// Format based on requested format
const typeStr = Array.isArray(type) ? type.join(', ') : type;
let combinedText: string;
if (format === 'index') {
const header = `Found ${results.length} observation(s) with type "${typeStr}":\n\n`;
@@ -827,6 +1095,29 @@ server.setRequestHandler(CallToolRequestSchema, async (request) => {

// Start the server
async function main() {
// Initialize Chroma client
try {
console.error('[search-server] Initializing Chroma client...');
const chromaTransport = new StdioClientTransport({
command: 'uvx',
args: ['chroma-mcp', '--client-type', 'persistent', '--data-dir', VECTOR_DB_DIR]
});

chromaClient = new Client({
name: 'claude-mem-search-chroma-client',
version: '1.0.0'
}, {
capabilities: {}
});

await chromaClient.connect(chromaTransport);
console.error('[search-server] Chroma client connected successfully');
} catch (error: any) {
console.error('[search-server] Failed to initialize Chroma client:', error.message);
console.error('[search-server] Falling back to FTS5-only search');
chromaClient = null;
}

const transport = new StdioServerTransport();
await server.connect(transport);
console.error('[search-server] Claude-mem search server started');

@@ -616,6 +616,33 @@ export class SessionStore {
return stmt.all(sdkSessionId) as any[];
}

/**
* Get observations by array of IDs with ordering and limit
*/
getObservationsByIds(
ids: number[],
options: { orderBy?: 'date_desc' | 'date_asc'; limit?: number } = {}
): any[] {
if (ids.length === 0) return [];

const { orderBy = 'date_desc', limit } = options;
const orderClause = orderBy === 'date_asc' ? 'ASC' : 'DESC';
const limitClause = limit ? `LIMIT ${limit}` : '';

// Build placeholders for IN clause
const placeholders = ids.map(() => '?').join(',');

const stmt = this.db.prepare(`
SELECT *
FROM observations
WHERE id IN (${placeholders})
ORDER BY created_at_epoch ${orderClause}
${limitClause}
`);

return stmt.all(...ids) as any[];
}

/**
* Get summary for a specific session
*/

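The search handlers above hydrate rows from SQLite (which returns them in date order) and then restore Chroma's semantic order with an `indexOf` sort. A minimal standalone sketch of that rank-restoring step, using illustrative stand-in values rather than real data from the commit:

```typescript
// Rank-restoring sort after hydration: SQLite returns rows in date
// order, so re-sort them to match the semantic ranking from Chroma.
// `rankedIds` and `rows` are illustrative, not taken from the diff.
interface Row { id: number; title: string; }

const rankedIds = [7, 42, 3];          // semantic order from Chroma
const rows: Row[] = [                  // date-ordered rows from SQLite
  { id: 3, title: 'oldest' },
  { id: 42, title: 'newer' },
  { id: 7, title: 'newest' },
];

rows.sort((a, b) => rankedIds.indexOf(a.id) - rankedIds.indexOf(b.id));
console.log(rows.map(r => r.id).join(',')); // → 7,42,3
```

One caveat of this pattern: `indexOf` inside a comparator is O(n²) overall, which is fine at the 20-to-100 result sizes used here but would warrant a `Map` of id-to-rank at larger scales.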
@@ -32,6 +32,7 @@ export const TRASH_DIR = join(DATA_DIR, 'trash');
export const BACKUPS_DIR = join(DATA_DIR, 'backups');
export const USER_SETTINGS_PATH = join(DATA_DIR, 'settings.json');
export const DB_PATH = join(DATA_DIR, 'claude-mem.db');
export const VECTOR_DB_DIR = join(DATA_DIR, 'vector-db');

// Claude integration paths
export const CLAUDE_SETTINGS_PATH = join(CLAUDE_CONFIG_DIR, 'settings.json');
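The regex parsing in `queryChroma()` recovers SQLite row IDs from the Python-dict text that the Chroma MCP server returns. A minimal sketch of the ID-extraction path, run against an illustrative sample response (not a real server reply):

```typescript
// Sample Chroma-style response text; the document IDs follow the
// obs_{sqlite_id}_title convention used by the sync script.
const resultText =
  "{'ids': [['obs_42_title', 'obs_7_fix']], 'distances': [[0.12, 0.34]]}";

const ids: number[] = [];
// Grab the inner ids array, then pull each quoted document ID.
const idsMatch = resultText.match(/'ids':\s*\[\[(.*?)\]\]/s);
if (idsMatch) {
  for (const quoted of idsMatch[1].match(/'([^']+)'/g) || []) {
    // Strip quotes, then extract the numeric sqlite_id.
    const m = quoted.slice(1, -1).match(/obs_(\d+)_/);
    if (m) ids.push(parseInt(m[1], 10));
  }
}
console.log(ids.join(',')); // → 42,7
```

Parsing Python repr output with regexes is brittle (it would break on IDs containing quotes or on a format change in chroma-mcp); the commit accepts that trade-off in exchange for avoiding a Python-literal parser dependency.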