Give any AI instant knowledge of your entire codebase — hybrid semantic search, dependency graphs, and infra context. Private. Local. Free.
Only Docker required · Works with Claude, Cursor, Windsurf, Cline
Tested on VS Code's 2.45M-line codebase with Claude Opus 4.6
| Question | grep context (bytes) | SocratiCode context (bytes) | Reduction | Speedup |
|---|---|---|---|---|
| VS Code workspace trust restrictions | 56,383 | 21,149 | 62.5% | 49.7× |
| Diff editor text differences | 37,650 | 15,961 | 57.6% | 40.2× |
| Extension activation & lifecycle | 36,231 | 16,181 | 55.3% | 34.4× |
| Integrated terminal shell management | 50,159 | 22,518 | 55.1% | 31.1× |
| Command palette & quick pick | 70,087 | 20,676 | 70.5% | 31.7× |
| TOTAL | 250,510 | 96,485 | 61.5% | 37.2× |
Deep codebase intelligence — no bloat, no setup, fully automatic.
Dense vector search plus BM25 lexical search, fused with Reciprocal Rank Fusion (RRF). Handles both semantic queries and exact identifier lookups.
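RRF fusion is simple at its core: each result list contributes `1 / (k + rank)` to a document's score, so items ranked well by either retriever rise to the top. A minimal sketch (the file names and `k = 60` constant are illustrative, not SocratiCode internals):

```python
def rrf_fuse(rankings, k=60):
    """Reciprocal Rank Fusion: merge several ranked result lists into one.

    rankings: list of ranked lists of document ids (best first).
    k: damping constant; 60 is the conventional choice.
    """
    scores = {}
    for ranked in rankings:
        for rank, doc_id in enumerate(ranked, start=1):
            # Each list votes with a reciprocal-rank score.
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical results: one list from dense vector search, one from BM25.
dense = ["auth.ts", "session.ts", "token.ts"]
bm25 = ["token.ts", "auth.ts", "crypto.ts"]
print(rrf_fuse([dense, bm25]))  # ['auth.ts', 'token.ts', 'session.ts', 'crypto.ts']
```

`auth.ts` wins because both retrievers rank it highly, even though neither puts it first in both lists; that robustness to disagreement is why RRF works well for hybrid search.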
AST-aware static analysis for 18+ languages. Circular dependency detection with Mermaid visualization.
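Circular dependency detection on a module graph is classically done with a depth-first search that tracks which nodes are on the current path. A minimal sketch of the idea (the module names are hypothetical; SocratiCode's actual implementation is not shown here):

```python
def find_cycle(graph):
    """DFS with three colors: return one dependency cycle, or None."""
    WHITE, GRAY, BLACK = 0, 1, 2  # unvisited / on current path / done
    color = {node: WHITE for node in graph}
    path = []

    def dfs(node):
        color[node] = GRAY
        path.append(node)
        for dep in graph.get(node, ()):
            if color[dep] == GRAY:
                # Back edge to a node on the current path: cycle found.
                return path[path.index(dep):] + [dep]
            if color[dep] == WHITE:
                found = dfs(dep)
                if found:
                    return found
        path.pop()
        color[node] = BLACK
        return None

    for node in graph:
        if color[node] == WHITE:
            found = dfs(node)
            if found:
                return found
    return None

# Hypothetical import graph: a -> b -> c -> a is circular.
modules = {"a": ["b"], "b": ["c"], "c": ["a", "d"], "d": []}
print(find_cycle(modules))  # ['a', 'b', 'c', 'a']
```

A cycle like this maps directly onto a Mermaid diagram (`a --> b --> c --> a`), which is how such findings can be visualized.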
Add to MCP config, done. Docker handles Qdrant + Ollama automatically. No YAML, no env vars, no native deps.
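For reference, MCP clients like Claude Desktop and Cursor register servers with an entry of this shape. The server name, image, and arguments below are illustrative placeholders, not SocratiCode's actual values; consult the project's install docs for the real entry:

```json
{
  "mcpServers": {
    "socraticode": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "socraticode/mcp"]
    }
  }
}
```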
Everything runs on your machine. Code never leaves your network. Air-gap ready.
Battle-tested on 40M+ line codebases. Resumable batched indexing, live file watching, multi-agent coordination.
Supported providers: Ollama (local), OpenAI, or Google Gemini. Switch with one env var.
1. Only Docker required.
2. Ask your AI: "Index this project".
3. Docker auto-starts Qdrant + Ollama.
4. AST chunking + hybrid embeddings build the index.
5. You get semantic search, graph queries, and context artifacts.
Live demos: real-time search, visual exploration of the code structure, and a side-by-side performance comparison.