This is a submission for the Notion MCP Challenge
What I Built
Open your Notion workspace right now. Scroll through the pages. How much of it is engineering intelligence? Not meeting notes. Not specs. Actual, structured, queryable intelligence — performance baselines, security audit trails, architectural decision drift, onboarding tracks generated from your real codebase, health scores synthesized across six different dimensions of your repo's vital signs.
For most teams, the answer is: almost none.
And that's strange, because every engineering team generates an enormous amount of signal every single day. Commits, pull requests, dependency updates, benchmark regressions, security vulnerabilities, architectural decisions that slowly drift from reality — all of it flowing through GitHub, all of it generating notifications that nobody reads, all of it evaporating into the void within 48 hours.
ENGRAM exists because I got tired of watching that signal disappear.
The Core Idea
ENGRAM is a self-hosted engineering intelligence platform built entirely in Rust. One binary. You run it, it listens to your GitHub repositories via webhooks, routes every event through 9 specialized AI agents powered by Claude, and writes structured, relational intelligence directly into 23 interconnected Notion databases.
No SaaS dashboard you'll forget to check. No separate Postgres instance to maintain. No sync layer to debug at 2 AM. Your engineering knowledge lives where your team already works — in Notion. Queryable, filterable, shareable, and connected through cross-database relations that turn flat data into a knowledge graph.
The Architecture in One Breath
GitHub Push/PR/Release
|
v
ENGRAM Core (axum + tokio)
|
|---> Decisions Agent --> Notion: RFCs, Comments, Decision Drift
|---> Pulse Agent --> Notion: Benchmarks, Regressions, Baselines
|---> Shield Agent --> Notion: Dependencies, Audit Runs, CVEs
|---> Atlas Agent --> Notion: Modules, Onboarding Tracks, Knowledge Gaps
|---> Vault Agent --> Notion: Env Configs, Secret Rotation
|---> Review Agent --> Notion: PR Reviews, Patterns, Tech Debt
|---> Health Agent --> Notion: Health Scores, Weekly Digests
|---> Timeline Agent --> Notion: Events, Cross-Agent Correlation
|---> Release Agent --> Notion: Release Notes, Changelogs
A single push event can trigger writes across five or more databases simultaneously — a benchmark result, a regression alert, an updated health score, a new timeline entry, and an updated module map. Intelligence compounds with every event.
The 9 Intelligence Layers
Each agent is a focused analyst with a specific domain. Not a monolithic "analyze everything" prompt — nine separate, domain-expert pipelines with their own Notion database schemas, their own Claude prompts, and their own cross-referencing logic.
| # | Layer | What It Does | Notion Databases |
|---|---|---|---|
| 1 | Decisions | Extracts architectural decisions from PRs, tracks RFC lifecycle, flags stale proposals, scores decision drift over time | RFCs, RFC Comments |
| 2 | Pulse | Parses CI benchmark output, maintains rolling performance baselines, detects regressions before they hit production | Benchmarks, Regressions, Performance Baselines |
| 3 | Shield | Runs dependency audits, deduplicates CVEs across runs, classifies severity, auto-creates RFCs for critical vulnerabilities | Dependencies, Audit Runs |
| 4 | Atlas | Maps the codebase into logical modules, generates step-by-step onboarding tracks for new contributors, identifies knowledge gaps | Modules, Onboarding Tracks, Onboarding Steps, Knowledge Gaps |
| 5 | Vault | Diffs environment configs between deployments, tracks secret rotation schedules, alerts when credentials go stale | Env Configs, Secret Rotation Logs |
| 6 | Review | Analyzes PR review comments for recurring patterns, extracts anti-patterns, promotes frequently-flagged issues into tech debt items | PR Reviews, Review Patterns, Tech Debt |
| 7 | Health | Computes composite health scores from commit velocity, merge times, test coverage, and open issues — generates weekly engineering digests | Health Reports, Engineering Digests |
| 8 | Timeline | Builds cross-agent event timelines, correlates changes across all 9 layers, maintains an immutable audit trail | Events |
| 9 | Release | Auto-generates release notes from merged PRs, categorizes changes by type, produces AI readiness assessments and migration notes | Releases, Changelogs |
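As a concrete illustration of the Health layer's composite scoring, here is a minimal sketch. The weights and normalization constants below are assumptions for illustration, not ENGRAM's actual formula:

```rust
// Hypothetical composite health score — the real formula lives in engram-health.
// Inputs: commits/day, average PR merge time in hours, coverage as a 0..1
// fraction, and open issue count. Output: a 0..100 score.
fn health_score(commit_velocity: f64, merge_time_hours: f64, coverage: f64, open_issues: u32) -> f64 {
    let velocity = (commit_velocity / 10.0).min(1.0);           // normalize to 0..1
    let merge = (1.0 - merge_time_hours / 72.0).max(0.0);       // faster merges score higher
    let issues = (1.0 - open_issues as f64 / 100.0).max(0.0);   // fewer open issues score higher
    // Illustrative weights summing to 1.0
    100.0 * (0.30 * velocity + 0.25 * merge + 0.25 * coverage + 0.20 * issues)
}

fn main() {
    // A healthy repo: brisk commits, fast merges, full coverage, no open issues.
    println!("{:.1}", health_score(10.0, 6.0, 1.0, 3));
}
```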
AI Interpretations — Not Just Data, but Analysis
Every data table in the dashboard supports click-to-expand detail rows showing AI-generated analysis stored in Notion. This is not a chatbot you query after the fact. The analysis runs once per event, at ingestion time, and the results live permanently in your Notion workspace.
- Decisions: Decision rationale, drift score with severity tag, drift notes explaining what changed
- Shield: AI triage recommendation per CVE with risk context and remediation priority
- Review: Quality score (0-100) and a complete AI review draft
- Atlas: Full AI summary of each module, key files as code references
- Vault: Sensitivity classification, AI analysis of each config variable's purpose and risk
- Releases: AI readiness assessment, generated release notes, migration notes for breaking changes
- Pulse: Impact analysis per regression, AI recommendation for resolution
- Health: Key risks and key wins extracted from cross-layer synthesis
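For the Pulse layer, the baseline-vs-current comparison can be sketched in a few lines. The threshold and field names here are illustrative assumptions, not ENGRAM's actual detection logic:

```rust
// Hypothetical rolling baseline for one benchmark.
struct Baseline {
    mean_ns: f64, // mean runtime in nanoseconds across recent runs
}

// Returns Some(delta_pct) when the current result exceeds the baseline by more
// than `threshold_pct`, i.e. a regression worth writing to Notion.
fn is_regression(baseline: &Baseline, current_ns: f64, threshold_pct: f64) -> Option<f64> {
    let delta_pct = (current_ns - baseline.mean_ns) / baseline.mean_ns * 100.0;
    (delta_pct > threshold_pct).then_some(delta_pct)
}

fn main() {
    let base = Baseline { mean_ns: 100.0 };
    // 25% slower than baseline with a 10% threshold -> flagged
    println!("{:?}", is_regression(&base, 125.0, 10.0)); // Some(25.0)
}
```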
What Makes This Different
Single binary. One ~15 MB Rust executable. The dashboard is compiled into the binary via rust-embed. The config template is embedded and auto-extracted on first run. The Windows build has the ENGRAM icon baked into the .exe. No Docker. No Node. No Python runtime. Download, run, open your browser.
Zero config files. The setup wizard in the embedded dashboard walks you through everything — Notion integration token, GitHub PAT, Claude API key. No .env files to manage. Everything persists to engram.toml, which the binary generates for you.
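For a sense of what the generated file looks like, here is an illustrative shape — the key names below are assumptions, not ENGRAM's actual `engram.toml` schema:

```toml
# Illustrative engram.toml — generated by the setup wizard, never hand-written.
[notion]
integration_token = "secret_..."

[github]
pat = "ghp_..."
repos = ["manojpisini/engram"]

[claude]
api_key = "sk-ant-..."
```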
Webhook-driven, not polling. ENGRAM receives real-time GitHub events via webhooks with HMAC-SHA256 verification. A push triggers analysis within seconds, not whenever a cron job wakes up.
Notion IS the database. There is no Postgres. No SQLite. No Redis. Every read and write goes through the Notion API. Your data is always in Notion — queryable, shareable, and visible to your entire team without asking them to learn another tool.
Demo mode. For presentations and videos, a demo.js script loads realistic mock data across all 23 databases — excluded from the production binary via rust-embed's #[exclude] directive. Load it with ?demo in the URL.
The Background
I build things in Rust because I believe the tool shapes the thinker. The constraints of ownership, lifetimes, and zero-cost abstractions force you to understand what you're actually building — not just what it does, but how it uses memory, how it handles failure, how it behaves under pressure.
ENGRAM started as a question: what if the intelligence your team generates every day didn't just flow through GitHub notifications and disappear? What if every commit, every PR review, every security audit left a structured trace in a system your team already lives in?
The answer turned out to be nine Rust crates, Claude for the thinking, and Notion for the memory.
The Notion MCP Challenge gave me the constraint I needed. Not "build something that uses Notion" — but "build something where Notion is load-bearing." Where removing Notion doesn't just remove a feature, it removes the entire persistence layer. That constraint produced ENGRAM.
Video Demo
Setup Flow (Under 2 Minutes)
- Run `./engram` — the server starts on `localhost:3000`
- Open the dashboard — the setup wizard appears automatically
- Paste your Notion integration token — ENGRAM creates all 23 databases with full schemas, relations, and rollup properties
- Paste your GitHub token — configure which repos to track
- Add the webhook URL to your GitHub repo settings
- Paste your Anthropic API key — all 9 agents come online
- Push code — watch Notion fill up with structured intelligence
Show Us the Code
manojpisini/engram
ENGRAM is a self-organizing engineering intelligence platform. It connects your GitHub repositories, Notion workspace, and Claude AI into a single autonomous system that continuously analyzes your codebase and writes structured intelligence directly into Notion.
Engineering Intelligence, etched in Notion.
No polling. No manual data entry. GitHub webhooks push events to ENGRAM, 9 specialized AI agents interpret them using Claude, and every insight — security audits, performance regressions, architecture maps, RFC lifecycle tracking, team health reports, onboarding documents — is written as structured, queryable, relational data in your Notion workspace.
Notion is the central nervous system. Every metric, every decision, every piece of intelligence lives in 23 interconnected databases in your workspace.
Key Features
- Single binary — dashboard, config template, and Windows icon all embedded via `rust-embed`. Just download and run.
- 9 AI agents — specialized intelligence layers, each with its own Claude prompts and Notion database schemas.
Tech Stack
| Component | Technology | Why |
|---|---|---|
| Core runtime | Rust, axum, tokio | Single binary, async webhook processing, broadcast channels for agent fan-out |
| AI backbone | Claude API (claude-sonnet-4-20250514) | Powers all 9 analysis agents with structured, domain-specific output |
| Persistence | Notion API via MCP client | Every database operation goes through Notion — no local database, no sync layer |
| Event ingestion | GitHub Webhooks | Real-time push/PR/release events with HMAC-SHA256 signature verification |
| Dashboard | Vanilla HTML/JS, Chart.js | Single-file SPA compiled into the binary — zero build step, zero dependencies |
| Auth | argon2 + JWT | Secure password hashing and token-based sessions |
| Packaging | GitHub Actions | Cross-platform builds: Linux (x86/ARM), macOS (Intel/Apple Silicon), Windows |
| Crate publishing | crates.io | `cargo install engram-core` — 11 crates published in dependency order |
Project Structure
engram/
├── crates/
│ ├── engram-core/ Main daemon: axum server, webhook handler,
│ │ ├── src/main.rs event router, scheduler, embedded dashboard
│ │ ├── src/webhook.rs HMAC verification, GitHub event parsing
│ │ └── build.rs Dashboard embedding, Windows icon, config copy
│ ├── engram-types/ Shared types, config, events, Notion schemas
│ ├── engram-decisions/ Layer 1 — RFC lifecycle, drift scoring
│ ├── engram-pulse/ Layer 2 — Benchmark tracking, regression detection
│ ├── engram-shield/ Layer 3 — Security audit, CVE triage
│ ├── engram-atlas/ Layer 4 — Module docs, onboarding, knowledge gaps
│ ├── engram-vault/ Layer 5 — Env config, secret rotation
│ ├── engram-review/ Layer 6 — PR analysis, tech debt, review patterns
│ ├── engram-health/ Layer 7 — Health scoring, weekly digest
│ ├── engram-timeline/ Layer 8 — Event correlation, audit trail
│ └── engram-release/ Layer 9 — Release notes, changelog
├── dashboard/
│ ├── index.html Single-page dashboard (embedded via rust-embed)
│ └── demo.js Mock data for demos (excluded from binary)
├── .github/workflows/
│ ├── release.yml Cross-platform release builds
│ ├── audit.yml Security audit → Shield agent
│ ├── benchmark.yml Benchmarks → Pulse agent
│ └── engram-notify.yml PR events → Review, Decisions, Timeline agents
└── engram.toml.example Config template (embedded, auto-extracted)
The Build System
The build.rs in engram-core does something I'm particularly satisfied with: it copies the workspace-level dashboard/ directory and engram.toml.example into the crate directory at build time, so that rust-embed's #[folder = "dashboard/"] works both in workspace development builds AND inside cargo package tarballs (where ../../dashboard/ doesn't exist). Demo data is excluded from the copy. The same build.rs embeds the ENGRAM icon into the Windows executable via winresource.
One binary. Dashboard included. Config template included. Icon included. Nothing to install, nothing to configure, nothing to forget.
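The copy step described above can be sketched as a small helper inside `build.rs`. The paths and helper name here are illustrative, not ENGRAM's actual code:

```rust
use std::{fs, path::Path};

// Copy the workspace-level dashboard into the crate directory so that
// rust-embed's #[folder = "dashboard/"] resolves both in workspace builds
// and inside `cargo package` tarballs. Returns how many files were copied.
fn copy_dashboard(src: &Path, dst: &Path) -> std::io::Result<usize> {
    let mut copied = 0;
    if src.exists() {
        fs::create_dir_all(dst)?;
        for entry in fs::read_dir(src)? {
            let entry = entry?;
            if entry.file_name() == "demo.js" {
                continue; // demo data never ships in the production binary
            }
            fs::copy(entry.path(), dst.join(entry.file_name()))?;
            copied += 1;
        }
    }
    Ok(copied)
}

fn main() {
    // In a real build.rs the source would be the workspace dashboard/ directory;
    // when it doesn't exist (e.g. inside a package tarball), this is a no-op.
    let _ = copy_dashboard(Path::new("../../dashboard"), Path::new("dashboard"));
    println!("cargo:rerun-if-changed=../../dashboard");
}
```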
How I Used Notion MCP
This is the section I care about most, because this is where the architecture either stands or falls.
Notion is not a display layer in ENGRAM. It is not an export target. It is not a "nice-to-have integration." Notion is the entire persistence backend. Remove it and ENGRAM has no database. No storage. No state. Every piece of data the system generates — every health score, every CVE triage, every RFC drift calculation, every onboarding step — is written to and read from Notion.
1. Automated Schema Creation — 23 Databases, Zero Manual Setup
When you click "Save & Initialize ENGRAM" in the setup wizard, the system creates 23 databases in your Notion workspace. Not empty databases — fully typed schemas with select properties, multi-select tags, date fields, number columns, URL links, relation properties linking databases to each other, and rollup calculations.
The schema definitions live in engram-types/src/notion_schema.rs. Every database operation is routed through the NotionMcpClient in engram-core/src/notion_client.rs. The code comment at the top of the schema file says it plainly:
> Every DB operation must use Notion MCP tools — never the raw Notion REST API.
That's the architectural constraint. One client. One protocol. One source of truth.
The 23 databases, grouped by domain:
| Domain | Databases |
|---|---|
| Projects | Projects |
| Decisions | RFCs, RFC Comments |
| Performance | Benchmarks, Regressions, Performance Baselines |
| Security | Dependencies, Audit Runs |
| Knowledge | Modules, Onboarding Tracks, Onboarding Steps, Knowledge Gaps |
| Config | Env Config, Config Snapshots, Secret Rotation Log |
| Review | PR Reviews, Review Playbook, Review Patterns, Tech Debt |
| Health | Health Reports, Engineering Digest |
| Timeline | Events |
| Release | Releases |
2. Real-Time Intelligence Writes — Event-Driven, Not Batch
When a GitHub webhook fires, ENGRAM's event router broadcasts it to all 9 agents simultaneously via tokio broadcast channels. Each agent:
- Receives the raw GitHub event payload
- Calls Claude with a domain-specific prompt and the relevant data
- Parses Claude's structured response
- Writes one or more pages to the relevant Notion databases
This is not a batch job that runs overnight. A push to main triggers benchmark analysis, regression detection, health score updates, timeline entries, and module map changes — written to Notion within seconds of the event. The cron scheduler handles periodic tasks (daily audits, weekly digests, RFC staleness checks), but the core intelligence loop is webhook-driven and real-time.
3. The Dashboard Reads from Notion — No Local Cache
The ENGRAM dashboard has no local database. When you open it, it queries Notion through ENGRAM's REST API, which reads from Notion in real-time. Health scores, onboarding tracks, dependency audits, PR review patterns — everything is rendered live from Notion data.
This means your team can view the same data in the ENGRAM dashboard OR directly in Notion — filtered, sorted, grouped, and shared however they prefer. The dashboard is a lens. Notion is the source.
4. Cross-Database Relations — The Knowledge Graph
This is what makes the Notion integration more than just "writing to a database." ENGRAM builds a connected knowledge graph inside your Notion workspace:
- RFCs link to PRs — trace an architectural decision to the code that implements it
- Regressions link to Baselines — see the exact performance delta and the commit that caused it
- Onboarding Steps link to Modules — each learning step references the codebase module it teaches
- Tech Debt links to Review Patterns — every debt item traces back to the review pattern that flagged it, with frequency count
- Audit Runs link to Dependencies — vulnerability findings connect to the specific dependency and version
- Timeline Events link to source agents — every event carries attribution to the intelligence layer that generated it
These aren't decorative links. They're queryable relations with rollup properties. You can build Notion views that answer questions like "show me all RFCs that have drifted more than 20% from their original decision" or "which modules have zero onboarding coverage?" — without leaving Notion.
5. What This Architecture Unlocks
Without the Notion-as-database approach, building ENGRAM would have required:
- A PostgreSQL or SQLite database for persistence
- A sync layer to mirror data to Notion for visibility
- Conflict resolution logic for bidirectional sync
- A separate query API for the dashboard
- Migration scripts for schema changes
With Notion as the single persistence layer, all of that disappears. Reads and writes go to one place. The user's engineering intelligence lives where they already work. Schema changes happen in the Notion database properties. There's nothing to sync because there's nothing to sync between.
The tradeoff is real — Notion API latency is higher than a local database, and rate limits matter at scale. But for the problem ENGRAM solves — structured engineering intelligence for teams that already live in Notion — the tradeoff is worth it. Your data is always accessible, always shareable, always where your team expects it.
Built with Rust, Claude, and Notion. One binary. 23 databases. 9 AI agents. Zero config files. The intelligence your team generates every day, structured and preserved in the workspace you already use.