If you back founders early, let's talk.
We're not running a process. We're building. But we're listening — and if there's a fit, we want to know early. Read the thesis below, then drop us a note.
The next $10B knowledge tool will be built for AI, not retrofitted for it.
Every knowledge worker who pays for Claude opens it with the same problem: their notes weren't built for AI. They burn 500–2,000 tokens per session re-explaining context; at 100 sessions a month, that's 50K–200K wasted tokens. Multiply across 20M+ paying Claude users and the spend is enormous, and entirely solvable with the right substrate.
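The arithmetic above is simple enough to sketch. A minimal example, using the per-session and per-month figures quoted above as inputs (they are the pitch's stated ranges, not measured values):

```rust
// Tokens wasted per month re-establishing context with an AI assistant.
// Inputs are the ranges quoted above: 500–2,000 tokens/session, 100 sessions/month.
fn monthly_waste(tokens_per_session: u64, sessions_per_month: u64) -> u64 {
    tokens_per_session * sessions_per_month
}

fn main() {
    let low = monthly_waste(500, 100); // lower bound of the quoted range
    let high = monthly_waste(2_000, 100); // upper bound of the quoted range
    println!("wasted tokens per month: {low}-{high}"); // 50,000-200,000
}
```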
Obsidian was built for humans writing markdown in 2020. Notion was built for collaborative docs in 2016. Neither was designed with AI as a first-class consumer of the data. Marje is. Bundled MCP server, two-tier factory configuration, hybrid storage, cascading folder rules, and a curated module marketplace.
We're attacking from the consumer side first; "can my mom use this" is the test. Enterprise governance runs on the same architecture (the factory layer already enforces it). The Enterprise early-access waitlist is open and is shaping the rollout.
Six things that make this hard to replicate.
Two-tier factory configuration
A hidden, AES-256-GCM encrypted layer that controls routing, permissions, and rendering. Users customize the surface; the system stays governed. The factory layer is what protects against prompt injection, conversational probing, and competitive reverse engineering.
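The governing idea can be sketched in a few lines. This is an illustrative simplification, not the shipping implementation: the real factory tier is AES-256-GCM encrypted and covers routing, permissions, and rendering, while here both tiers are plain maps and the key names are hypothetical.

```rust
use std::collections::HashMap;

// Two-tier resolution: users customize the surface, but any key governed
// by the factory layer always wins over a user override.
fn resolve(
    factory: &HashMap<String, String>, // hidden, governed tier
    user: &HashMap<String, String>,    // visible, customizable tier
) -> HashMap<String, String> {
    let mut merged = user.clone();
    // Factory entries overwrite user entries, so governed settings
    // stay enforced no matter what the user writes.
    for (k, v) in factory {
        merged.insert(k.clone(), v.clone());
    }
    merged
}

fn main() {
    let factory = HashMap::from([("routing.model".to_string(), "approved-only".to_string())]);
    let user = HashMap::from([
        ("routing.model".to_string(), "anything".to_string()),
        ("theme".to_string(), "dark".to_string()),
    ]);
    let resolved = resolve(&factory, &user);
    assert_eq!(resolved["routing.model"], "approved-only"); // governed key wins
    assert_eq!(resolved["theme"], "dark"); // user surface preserved
}
```

The same merge order is why prompt injection or conversational probing can't talk the system out of its governed settings: the user-visible tier simply never has the last word.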
MCP-native vault
A built-in Model Context Protocol server with 50+ purpose-built tools. The vault was designed for Claude from day one — not retrofitted with a plugin. First-mover position on AI-native local-first knowledge management.
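For readers unfamiliar with MCP: an MCP server exposes named tools that a client like Claude can list and invoke. A hedged sketch of the registry pattern, where the tool name, description, and handler body are all illustrative, not drawn from the actual 50+ tools:

```rust
// A toy MCP-style tool registry. Real MCP tools also carry a JSON input
// schema and speak JSON-RPC; this sketch keeps only the lookup-and-invoke shape.
struct Tool {
    name: &'static str,
    description: &'static str,
    handler: fn(&str) -> String,
}

// Hypothetical handler: a real one would query the vault's index.
fn search_notes(query: &str) -> String {
    format!("results for '{query}'")
}

fn main() {
    let tools = vec![Tool {
        name: "search_notes",
        description: "Full-text search across the vault",
        handler: search_notes,
    }];
    // A client lists the tools, then invokes one by name:
    let t = tools.iter().find(|t| t.name == "search_notes").unwrap();
    println!("{}: {}", t.description, (t.handler)("project alpha"));
}
```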
Token-efficiency moat
Architectural pre-loading of context eliminates the 50K–200K monthly tokens that a typical Claude+Obsidian user wastes re-establishing context. The vault pays for itself in saved API spend. Architecturally hard for competitors to replicate without rebuilding from scratch.
Patent-pending IP
Provisional patent application filed covering the two-tier factory/user configuration architecture and the cascading rule engine. Trademark filing in progress. AES-256-GCM encrypted factory rules + native binary distribution make piracy and reverse engineering prohibitively expensive.
Module marketplace
Folders that act like apps — Client List, Job Estimates, Podcast Tracker. Three categories: Workspaces, Intelligence, Data Forms. Curated, signed packages distributed via Cloudflare R2 with per-user entitlement tokens. Two-sided market potential, currently first-party only.
Zero variable AI cost
Users bring their own Claude (or OpenAI, or local Ollama) subscription. Marje captures 100% of license and module revenue with no API cost-of-goods. Infrastructure cost at launch is effectively $0 on Cloudflare free tiers.
$0 variable cost. Multiple revenue streams.
Where we are right now.
- Product — Beta on Windows. Daily build, single founder, full vertical product (Svelte frontend, Rust backend, MCP server).
- Platforms — Windows at launch. Mac within 60 days. Linux on the roadmap.
- IP — Provisional patent filed. Trademark in progress.
- Stack — Tauri v2, Svelte 5, Rust. Cloudflare for licensing + module distribution (R2, Workers, D1).
- Distribution — Direct download at launch. Module store runs on Stripe Checkout with signed entitlement tokens.
- Costs at launch — ~$0/mo on Cloudflare free tiers. Code signing + distribution only.
- Enterprise — Foundation shipping (audit, RBAC, approvals). Centralized policy + SSO in early access.
- Founder — Kevin Hibbard. Owosso, Michigan. Inventor, full-stack developer, serial entrepreneur. Multiple operating businesses under Hibb Co. LLC.
Open a conversation.
Tell us about you. We're not running a process — we're listening. If there's a fit, we'll move fast. If there isn't, we'll tell you straight.
You can also email [email protected] directly.