c6f932988a
* MAESTRO: fix ChromaDB core issues — Python pinning, Windows paths, disable toggle, metadata sanitization, transport errors

  - Add --python version pinning to uvx args in both local and remote mode (fixes #1196, #1206, #1208)
  - Convert backslash paths to forward slashes for --data-dir on Windows (fixes #1199)
  - Add CLAUDE_MEM_CHROMA_ENABLED setting for SQLite-only fallback mode (fixes #707)
  - Sanitize metadata in addDocuments() to filter null/undefined/empty values (fixes #1183, #1188)
  - Wrap callTool() in try/catch for transport errors with auto-reconnect (fixes #1162)

  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* MAESTRO: fix data integrity — content-hash deduplication, project name collision, empty project guard, stuck isProcessing

  - Add SHA-256 content-hash deduplication to observations INSERT (store.ts, transactions.ts, SessionStore.ts)
  - Add content_hash column via migration 22 with backfill and index
  - Fix project name collision: getCurrentProjectName() now returns parent/basename
  - Guard against empty project string with cwd-derived fallback
  - Fix stuck isProcessing: hasAnyPendingWork() resets processing messages older than 5 minutes
  - Add 12 new tests covering all four fixes

  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* MAESTRO: fix hook lifecycle — stderr suppression, output isolation, conversation pollution prevention

  - Suppress process.stderr.write in hookCommand() to prevent Claude Code showing diagnostic output as error UI (#1181). Restores stderr in finally block for the worker-continues case.
  - Convert console.error() to logger.warn()/error() in hook-command.ts and handlers/index.ts so all diagnostics route to the log file instead of stderr.
  - Verified all 7 handlers return suppressOutput: true (prevents conversation pollution #598, #784).
  - Verified session-complete is a recognized event type (fixes #984).
  - Verified unknown event types return a no-op handler with exit 0 (graceful degradation).
  - Added 10 new tests in tests/hook-lifecycle.test.ts covering event dispatch, adapter defaults, stderr suppression, and standard response constants.

  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* MAESTRO: fix worker lifecycle — restart loop coordination, stale transport retry, ENOENT shutdown race

  - Add PID file mtime guard to prevent concurrent restart storms (#1145): isPidFileRecent() + touchPidFile() coordinate across sessions
  - Add transparent retry in ChromaMcpManager.callTool() on transport error — reconnects and retries once instead of failing (#1131)
  - Wrap getInstalledPluginVersion() with ENOENT/EBUSY handling (#1042)
  - Verified ChromaMcpManager.stop() already called on all shutdown paths

  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* MAESTRO: fix Windows platform support — uvx.cmd spawn, PowerShell $_ elimination, windowsHide, FTS5 fallback

  - Route uvx spawn through cmd.exe /c on Windows since the MCP SDK lacks shell: true (#1190, #1192, #1199)
  - Replace all PowerShell Where-Object {$_} pipelines with WQL -Filter server-side filtering (#1024, #1062)
  - Add windowsHide: true to all exec/spawn calls missing it to prevent console popups (#1048)
  - Add FTS5 runtime probe with graceful fallback when unavailable on Windows (#791)
  - Guard FTS5 table creation in migrations, SessionSearch, and SessionStore with try/catch

  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* MAESTRO: fix skills/ distribution — build-time verification and regression tests (#1187)

  Add post-build verification in build-hooks.js that fails if critical distribution files (skills, hooks, plugin manifest) are missing. Add 10 regression tests covering skill file presence, YAML frontmatter, hooks.json integrity, and the package.json files field.
  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* MAESTRO: fix MigrationRunner schema initialization (#979) — version conflict between parallel migration systems

  Root cause: old DatabaseManager migrations 1-7 shared the schema_versions table with MigrationRunner's 4-22, causing version number collisions (5 = drop tables vs add column, 6 = FTS5 vs prompt tracking, 7 = discovery_tokens vs remove UNIQUE). initializeSchema() was gated behind maxApplied === 0, so core tables were never created when old versions were present.

  Fixes:
  - initializeSchema() always creates core tables via CREATE TABLE IF NOT EXISTS
  - Migrations 5-7 check actual DB state (columns/constraints), not just version tracking
  - Crash-safe temp table rebuilds (DROP IF EXISTS _new before CREATE)
  - Added missing migration 21 (ON UPDATE CASCADE) to MigrationRunner
  - Added ON UPDATE CASCADE to FK definitions in initializeSchema()
  - All changes applied to both runner.ts and SessionStore.ts

  Tests: 13 new tests in migration-runner.test.ts covering fresh DB, idempotency, version conflicts, crash recovery, FK constraints, and data integrity.

  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* MAESTRO: fix 21 test failures — stale mocks, outdated assertions, missing OpenClaw guards

  - Server tests (12): Added missing workerPath and getAiStatus to ServerOptions mocks after interface expansion.
  - ChromaSync tests (3): Updated to verify transport cleanup in ChromaMcpManager after architecture refactor.
  - OpenClaw (2): Added memory_ tool skipping and response truncation to prevent recursive loops and oversized payloads.
  - MarkdownFormatter (2): Updated assertions to match current output.
  - SettingsDefaultsManager (1): Used correct default key for getBool test.
  - Logger standards (1): Excluded CLI transcript command from background service check.
  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* MAESTRO: fix Codex CLI compatibility (#744) — session_id fallbacks, unknown platform tolerance, undefined guard

  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* MAESTRO: fix Cursor IDE integration (#838, #1049) — adapter field fallbacks, tolerant session-init validation

  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* MAESTRO: fix /api/logs OOM (#1203) — tail-read replaces full-file readFileSync

  Replace readFileSync (which loads the entire file into memory) with readLastLines(), which reads only from the end of the file in expanding chunks (64KB → 10MB cap). Prevents OOM on large log files while preserving the same API response shape.

  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* MAESTRO: fix Settings CORS error (#1029) — explicit methods and allowedHeaders in CORS config

  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* MAESTRO: add session custom_title for agent attribution (#1213) — migration 23, endpoint + store support

  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* MAESTRO: prevent CLAUDE.md/AGENTS.md writes inside .git/ directories (#1165)

  Add a .git path guard to all 4 write sites to prevent ref corruption when paths resolve inside .git internals.

  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* MAESTRO: fix plugin disabled state not respected (#781) — early exit check in all hook entry points

  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* MAESTRO: fix UserPromptSubmit context re-injection on every turn (#1079) — contextInjected session flag

  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* MAESTRO: fix stale AbortController queue stall (#1099) — lastGeneratorActivity tracking + 30s timeout

  Three-layer fix:
  1. Added a lastGeneratorActivity timestamp to ActiveSession, updated by processAgentResponse (all agents), getMessageIterator (queue yields), and startGeneratorWithProvider (generator launch)
  2. Added stale generator detection in ensureGeneratorRunning — if no activity for >30s, aborts the stale controller, resets state, and restarts
  3. Added AbortSignal.timeout(30000) in deleteSession to prevent an indefinite hang when awaiting a stuck generator promise

  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
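The /api/logs fix above replaces a full-file `readFileSync` with an expanding tail read. A minimal sketch of that approach — the `readLastLines` name and the 64KB/10MB constants come from the commit message; the rest is illustrative, not the shipped implementation:

```typescript
import { openSync, readSync, fstatSync, closeSync } from 'fs';

// Read at most maxLines complete lines from the end of a file without
// loading the whole file: start with a 64KB window at the tail, double it
// until enough lines are found, the file start is reached, or a 10MB cap.
function readLastLines(filePath: string, maxLines: number): string[] {
  const START_CHUNK = 64 * 1024;       // 64KB initial window
  const MAX_BYTES = 10 * 1024 * 1024;  // 10MB hard cap
  const fd = openSync(filePath, 'r');
  try {
    const fileSize = fstatSync(fd).size;
    let chunkSize = Math.min(START_CHUNK, fileSize);
    while (true) {
      const start = fileSize - chunkSize;
      const buffer = Buffer.alloc(chunkSize);
      readSync(fd, buffer, 0, chunkSize, start);
      const lines = buffer.toString('utf-8').split('\n').filter(l => l.length > 0);

      // More lines than needed and we started mid-file: the first line may be
      // partial, but slicing the last maxLines drops it anyway.
      if (lines.length > maxLines && start > 0) {
        return lines.slice(-maxLines);
      }
      // Read the whole file or hit the cap: return what we have.
      if (start === 0 || chunkSize >= MAX_BYTES) {
        return lines.slice(-maxLines);
      }
      chunkSize = Math.min(chunkSize * 2, MAX_BYTES, fileSize);
    }
  } finally {
    closeSync(fd);
  }
}
```

The design point is that memory use is bounded by the chunk cap rather than by log-file size, while the caller still receives plain lines and can keep the API response shape unchanged.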
#!/usr/bin/env bun
/**
 * Regenerate CLAUDE.md files for folders in the current project
 *
 * Usage:
 *   bun scripts/regenerate-claude-md.ts [--dry-run] [--clean]
 *
 * Options:
 *   --dry-run  Show what would be done without writing files
 *   --clean    Remove auto-generated CLAUDE.md files instead of regenerating
 *
 * Behavior:
 * - Scopes to current working directory (not entire database history)
 * - Uses git ls-files to respect .gitignore (skips node_modules, .git, etc.)
 * - Only processes folders that exist within the current project
 * - Filters database to current project observations only
 */

import { Database } from 'bun:sqlite';
import path from 'path';
import os from 'os';
import { existsSync, mkdirSync, writeFileSync, readFileSync, renameSync, unlinkSync, readdirSync } from 'fs';
import { execSync } from 'child_process';
import { SettingsDefaultsManager } from '../src/shared/SettingsDefaultsManager.js';

const DB_PATH = path.join(os.homedir(), '.claude-mem', 'claude-mem.db');
const SETTINGS_PATH = path.join(os.homedir(), '.claude-mem', 'settings.json');
const settings = SettingsDefaultsManager.loadFromFile(SETTINGS_PATH);
const OBSERVATION_LIMIT = parseInt(settings.CLAUDE_MEM_CONTEXT_OBSERVATIONS, 10) || 50;
interface ObservationRow {
  id: number;
  title: string | null;
  subtitle: string | null;
  narrative: string | null;
  facts: string | null;
  type: string;
  created_at: string;
  created_at_epoch: number;
  files_modified: string | null;
  files_read: string | null;
  project: string;
  discovery_tokens: number | null;
}

// Import shared utilities
import { formatTime, groupByDate } from '../src/shared/timeline-formatting.js';
import { isDirectChild } from '../src/shared/path-utils.js';
import { replaceTaggedContent } from '../src/utils/claude-md-utils.js';
// Type icon map (matches ModeManager)
|
|
const TYPE_ICONS: Record<string, string> = {
|
|
'bugfix': '🔴',
|
|
'feature': '🟣',
|
|
'refactor': '🔄',
|
|
'change': '✅',
|
|
'discovery': '🔵',
|
|
'decision': '⚖️',
|
|
'session': '🎯',
|
|
'prompt': '💬'
|
|
};
|
|
|
|
function getTypeIcon(type: string): string {
|
|
return TYPE_ICONS[type] || '📝';
|
|
}
|
|
|
|
function estimateTokens(obs: ObservationRow): number {
|
|
const size = (obs.title?.length || 0) +
|
|
(obs.subtitle?.length || 0) +
|
|
(obs.narrative?.length || 0) +
|
|
(obs.facts?.length || 0);
|
|
return Math.ceil(size / 4);
|
|
}
|
|
|
|
/**
 * Get tracked folders using git ls-files
 * This respects .gitignore and only returns folders within the project
 */
function getTrackedFolders(workingDir: string): Set<string> {
  const folders = new Set<string>();

  try {
    // Get all tracked files using git ls-files
    const output = execSync('git ls-files', {
      cwd: workingDir,
      encoding: 'utf-8',
      maxBuffer: 50 * 1024 * 1024 // 50MB buffer for large repos
    });

    const files = output.trim().split('\n').filter(f => f);

    for (const file of files) {
      // Get the absolute path, then extract directory
      const absPath = path.join(workingDir, file);
      let dir = path.dirname(absPath);

      // Add all parent directories up to (but not including) the working dir
      while (dir.length > workingDir.length && dir.startsWith(workingDir)) {
        folders.add(dir);
        dir = path.dirname(dir);
      }
    }
  } catch (error) {
    console.error('Warning: git ls-files failed, falling back to directory walk');
    // Fallback: walk directories but skip common ignored patterns
    walkDirectoriesWithIgnore(workingDir, folders);
  }

  return folders;
}
/**
 * Fallback directory walker that skips common ignored patterns
 */
function walkDirectoriesWithIgnore(dir: string, folders: Set<string>, depth: number = 0): void {
  if (depth > 10) return; // Prevent infinite recursion

  const ignorePatterns = [
    'node_modules', '.git', '.next', 'dist', 'build', '.cache',
    '__pycache__', '.venv', 'venv', '.idea', '.vscode', 'coverage',
    '.claude-mem', '.open-next', '.turbo'
  ];

  try {
    const entries = readdirSync(dir, { withFileTypes: true });
    for (const entry of entries) {
      if (!entry.isDirectory()) continue;
      if (ignorePatterns.includes(entry.name)) continue;
      if (entry.name.startsWith('.') && entry.name !== '.claude') continue;

      const fullPath = path.join(dir, entry.name);
      folders.add(fullPath);
      walkDirectoriesWithIgnore(fullPath, folders, depth + 1);
    }
  } catch {
    // Ignore permission errors
  }
}
/**
 * Check if an observation has any files that are direct children of the folder
 */
function hasDirectChildFile(obs: ObservationRow, folderPath: string): boolean {
  const checkFiles = (filesJson: string | null): boolean => {
    if (!filesJson) return false;
    try {
      const files = JSON.parse(filesJson);
      if (Array.isArray(files)) {
        return files.some(f => isDirectChild(f, folderPath));
      }
    } catch {}
    return false;
  };

  return checkFiles(obs.files_modified) || checkFiles(obs.files_read);
}
/**
 * Query observations for a specific folder
 * folderPath is a relative path from the project root (e.g., "src/services")
 * Only returns observations with files directly in the folder (not in subfolders)
 */
function findObservationsByFolder(db: Database, relativeFolderPath: string, project: string, limit: number): ObservationRow[] {
  // Query more results than needed since we'll filter some out
  const queryLimit = limit * 3;

  const sql = `
    SELECT o.*, o.discovery_tokens
    FROM observations o
    WHERE o.project = ?
      AND (o.files_modified LIKE ? OR o.files_read LIKE ?)
    ORDER BY o.created_at_epoch DESC
    LIMIT ?
  `;

  // Files in DB are stored as relative paths like "src/services/foo.ts"
  // Match any file that starts with this folder path (we'll filter to direct children below)
  const likePattern = `%"${relativeFolderPath}/%`;
  const allMatches = db.prepare(sql).all(project, likePattern, likePattern, queryLimit) as ObservationRow[];

  // Filter to only observations with direct child files (not in subfolders)
  return allMatches.filter(obs => hasDirectChildFile(obs, relativeFolderPath)).slice(0, limit);
}
/**
 * Extract relevant file from an observation for display
 * Only returns files that are direct children of the folder (not in subfolders)
 * @param obs - The observation row
 * @param relativeFolder - Relative folder path (e.g., "src/services")
 */
function extractRelevantFile(obs: ObservationRow, relativeFolder: string): string {
  // Try files_modified first - only direct children
  if (obs.files_modified) {
    try {
      const modified = JSON.parse(obs.files_modified);
      if (Array.isArray(modified) && modified.length > 0) {
        for (const file of modified) {
          if (isDirectChild(file, relativeFolder)) {
            // Get just the filename (no path since it's a direct child)
            return path.basename(file);
          }
        }
      }
    } catch {}
  }

  // Fall back to files_read - only direct children
  if (obs.files_read) {
    try {
      const read = JSON.parse(obs.files_read);
      if (Array.isArray(read) && read.length > 0) {
        for (const file of read) {
          if (isDirectChild(file, relativeFolder)) {
            return path.basename(file);
          }
        }
      }
    } catch {}
  }

  return 'General';
}
/**
 * Format observations for CLAUDE.md content
 */
function formatObservationsForClaudeMd(observations: ObservationRow[], folderPath: string): string {
  const lines: string[] = [];
  lines.push('# Recent Activity');
  lines.push('');

  if (observations.length === 0) {
    return '';
  }

  const byDate = groupByDate(observations, obs => obs.created_at);

  for (const [day, dayObs] of byDate) {
    lines.push(`### ${day}`);
    lines.push('');

    const byFile = new Map<string, ObservationRow[]>();
    for (const obs of dayObs) {
      const file = extractRelevantFile(obs, folderPath);
      if (!byFile.has(file)) byFile.set(file, []);
      byFile.get(file)!.push(obs);
    }

    for (const [file, fileObs] of byFile) {
      lines.push(`**${file}**`);
      lines.push('| ID | Time | T | Title | Read |');
      lines.push('|----|------|---|-------|------|');

      let lastTime = '';
      for (const obs of fileObs) {
        const time = formatTime(obs.created_at_epoch);
        const timeDisplay = time === lastTime ? '"' : time;
        lastTime = time;

        const icon = getTypeIcon(obs.type);
        const title = obs.title || 'Untitled';
        const tokens = estimateTokens(obs);

        lines.push(`| #${obs.id} | ${timeDisplay} | ${icon} | ${title} | ~${tokens} |`);
      }

      lines.push('');
    }
  }

  return lines.join('\n').trim();
}
/**
 * Write CLAUDE.md file with tagged content preservation
 * Note: For the CLI regenerate tool, we DO create directories since the user
 * explicitly requested regeneration. This differs from the runtime behavior
 * which only writes to existing folders.
 */
function writeClaudeMdToFolderForRegenerate(folderPath: string, newContent: string): void {
  const resolvedPath = path.resolve(folderPath);

  // Never write inside .git directories — corrupts refs (#1165)
  if (resolvedPath.includes('/.git/') || resolvedPath.includes('\\.git\\') || resolvedPath.endsWith('/.git') || resolvedPath.endsWith('\\.git')) return;

  const claudeMdPath = path.join(folderPath, 'CLAUDE.md');
  const tempFile = `${claudeMdPath}.tmp`;

  // For regenerate CLI, we create the folder if needed
  mkdirSync(folderPath, { recursive: true });

  // Read existing content if file exists
  let existingContent = '';
  if (existsSync(claudeMdPath)) {
    existingContent = readFileSync(claudeMdPath, 'utf-8');
  }

  // Use shared utility to preserve user content outside tags
  const finalContent = replaceTaggedContent(existingContent, newContent);

  // Atomic write: temp file + rename
  writeFileSync(tempFile, finalContent);
  renameSync(tempFile, claudeMdPath);
}
/**
 * Clean up auto-generated CLAUDE.md files
 *
 * For each file with <claude-mem-context> tags:
 * - Strip the tagged section
 * - If empty after stripping → delete the file
 * - If has remaining content → save the stripped version
 */
function cleanupAutoGeneratedFiles(workingDir: string, dryRun: boolean): void {
  console.log('=== CLAUDE.md Cleanup Mode ===\n');
  console.log(`Scanning ${workingDir} for CLAUDE.md files with auto-generated content...\n`);

  const filesToProcess: string[] = [];

  // Walk directories to find CLAUDE.md files
  function walkForClaudeMd(dir: string): void {
    const ignorePatterns = ['node_modules', '.git', '.next', 'dist', 'build'];

    try {
      const entries = readdirSync(dir, { withFileTypes: true });
      for (const entry of entries) {
        const fullPath = path.join(dir, entry.name);

        if (entry.isDirectory()) {
          if (!ignorePatterns.includes(entry.name)) {
            walkForClaudeMd(fullPath);
          }
        } else if (entry.name === 'CLAUDE.md') {
          // Check if file contains auto-generated content
          try {
            const content = readFileSync(fullPath, 'utf-8');
            if (content.includes('<claude-mem-context>')) {
              filesToProcess.push(fullPath);
            }
          } catch {
            // Skip files we can't read
          }
        }
      }
    } catch {
      // Ignore permission errors
    }
  }

  walkForClaudeMd(workingDir);

  if (filesToProcess.length === 0) {
    console.log('No CLAUDE.md files with auto-generated content found.');
    return;
  }

  console.log(`Found ${filesToProcess.length} CLAUDE.md files with auto-generated content:\n`);

  let deletedCount = 0;
  let cleanedCount = 0;
  let errorCount = 0;

  for (const file of filesToProcess) {
    const relativePath = path.relative(workingDir, file);

    try {
      const content = readFileSync(file, 'utf-8');

      // Strip the claude-mem-context tagged section
      const stripped = content.replace(/<claude-mem-context>[\s\S]*?<\/claude-mem-context>/g, '').trim();

      if (stripped === '') {
        // Empty after stripping → delete
        if (dryRun) {
          console.log(` [DRY-RUN] Would delete (empty): ${relativePath}`);
        } else {
          unlinkSync(file);
          console.log(` Deleted (empty): ${relativePath}`);
        }
        deletedCount++;
      } else {
        // Has content → write stripped version
        if (dryRun) {
          console.log(` [DRY-RUN] Would clean: ${relativePath}`);
        } else {
          writeFileSync(file, stripped);
          console.log(` Cleaned: ${relativePath}`);
        }
        cleanedCount++;
      }
    } catch (error) {
      console.error(` Error processing ${relativePath}: ${error}`);
      errorCount++;
    }
  }

  console.log('\n=== Summary ===');
  console.log(`Deleted (empty): ${deletedCount}`);
  console.log(`Cleaned: ${cleanedCount}`);
  console.log(`Errors: ${errorCount}`);

  if (dryRun) {
    console.log('\nRun without --dry-run to actually process files.');
  }
}
/**
 * Regenerate CLAUDE.md for a single folder
 * @param absoluteFolder - Absolute path for writing files
 * @param relativeFolder - Relative path for DB queries (matches storage format)
 */
function regenerateFolder(
  db: Database,
  absoluteFolder: string,
  relativeFolder: string,
  project: string,
  dryRun: boolean
): { success: boolean; observationCount: number; error?: string } {
  try {
    // Query using relative path (matches DB storage format)
    const observations = findObservationsByFolder(db, relativeFolder, project, OBSERVATION_LIMIT);

    if (observations.length === 0) {
      return { success: false, observationCount: 0, error: 'No observations for folder' };
    }

    if (dryRun) {
      return { success: true, observationCount: observations.length };
    }

    // Format using relative path for display, write to absolute path
    const formatted = formatObservationsForClaudeMd(observations, relativeFolder);
    writeClaudeMdToFolderForRegenerate(absoluteFolder, formatted);

    return { success: true, observationCount: observations.length };
  } catch (error) {
    return { success: false, observationCount: 0, error: String(error) };
  }
}
/**
 * Main function
 */
async function main() {
  const args = process.argv.slice(2);
  const dryRun = args.includes('--dry-run');
  const cleanMode = args.includes('--clean');

  const workingDir = process.cwd();

  // Handle cleanup mode
  if (cleanMode) {
    cleanupAutoGeneratedFiles(workingDir, dryRun);
    return;
  }

  console.log('=== CLAUDE.md Regeneration Script ===\n');
  console.log(`Working directory: ${workingDir}`);

  // Determine project identifier (matches how hooks determine project - uses folder name)
  const project = path.basename(workingDir);
  console.log(`Project: ${project}\n`);

  // Get tracked folders using git ls-files
  console.log('Discovering folders (using git ls-files to respect .gitignore)...');
  const trackedFolders = getTrackedFolders(workingDir);

  if (trackedFolders.size === 0) {
    console.log('No folders found in project.');
    process.exit(0);
  }

  console.log(`Found ${trackedFolders.size} folders in project.\n`);

  // Open database
  if (!existsSync(DB_PATH)) {
    console.log('Database not found. No observations to process.');
    process.exit(0);
  }

  console.log('Opening database...');
  const db = new Database(DB_PATH, { readonly: true, create: false });

  if (dryRun) {
    console.log('[DRY RUN] Would regenerate the following folders:\n');
  }

  // Process each folder
  let successCount = 0;
  let skipCount = 0;
  let errorCount = 0;

  const foldersArray = Array.from(trackedFolders).sort();

  for (let i = 0; i < foldersArray.length; i++) {
    const absoluteFolder = foldersArray[i];
    const progress = `[${i + 1}/${foldersArray.length}]`;
    const relativeFolder = path.relative(workingDir, absoluteFolder);

    if (dryRun) {
      // Query using relative path (matches DB storage format)
      const observations = findObservationsByFolder(db, relativeFolder, project, OBSERVATION_LIMIT);
      if (observations.length > 0) {
        console.log(`${progress} ${relativeFolder} (${observations.length} obs)`);
        successCount++;
      } else {
        skipCount++;
      }
      continue;
    }

    const result = regenerateFolder(db, absoluteFolder, relativeFolder, project, dryRun);

    if (result.success) {
      console.log(`${progress} ${relativeFolder} - ${result.observationCount} obs`);
      successCount++;
    } else if (result.error?.includes('No observations')) {
      skipCount++;
    } else {
      console.log(`${progress} ${relativeFolder} - ERROR: ${result.error}`);
      errorCount++;
    }
  }

  db.close();

  // Summary
  console.log('\n=== Summary ===');
  console.log(`Total folders scanned: ${foldersArray.length}`);
  console.log(`With observations: ${successCount}`);
  console.log(`No observations: ${skipCount}`);
  console.log(`Errors: ${errorCount}`);

  if (dryRun) {
    console.log('\nRun without --dry-run to actually regenerate files.');
  }
}
main().catch(error => {
  console.error('Fatal error:', error);
  process.exit(1);
});