A native .NET property-graph database with Cypher, a full Graph Data Science algorithm suite, window functions, native Model Context Protocol servers, and a planner-integrated extension surface for custom indexes. Apache 2.0 licensed.
```csharp
using BogDB.Core.Main;

using var db = BogDatabase.CreateInMemory();
using var conn = new BogConnection(db);

conn.Query("CREATE NODE TABLE Person (id INT64, name STRING, PRIMARY KEY(id))");
conn.Query("CREATE REL TABLE KNOWS (FROM Person TO Person)");

conn.BeginWriteTransaction();
conn.Query("CREATE (:Person {id:1, name:'Alice'})-[:KNOWS]->(:Person {id:2, name:'Bob'})");
conn.Commit();

var r = conn.Query("MATCH (a)-[:KNOWS]->(b) RETURN a.name, b.name");
```

The graph-database landscape circa 2026 has a hole in it. Neo4j is JVM-first and copyleft. DuckDB does columns, not graphs. Upstream Kùzu — the embeddable, permissively licensed graph database many of us were standardizing on — was acquired into a private roadmap, with the public project archived after its last 0.11.x release.
BogDB is the answer for everybody else: an actively-maintained, embeddable, .NET-native graph database under a permissive Apache 2.0 license. It started as a test-by-test parity port of the Kùzu C++ engine and now ships capabilities the original never had — full-frame-clause window functions, an expanded Graph Data Science algorithm suite, native Model Context Protocol servers, and a planner-integrated extension surface that lets you plug custom indexes into query optimization without modifying or forking the engine.
BogDB is built by Beyond Ordinary Software Solutions and lives at github.com/BeyondOrdinary/BogDB.
Three intersecting capabilities, all in one embeddable .NET assembly.
Property-graph data model, ANTLR4 Cypher parser, planner with all eleven optimizer rules at upstream parity, 226 scalar functions, 13 window functions with full ROWS / RANGE / GROUPS frame clauses, and ten Graph Data Science algorithms, including PageRank, WCC, SSSP, K-Hop, VLP, Louvain, SCC, K-Core, and Span Forest. Disk-backed columnar storage, MVCC, and WAL with page-level recovery.
Native Model Context Protocol servers ship with the engine. Vector and LLM extensions are built in. Natural-language Cypher generation runs through the codegen MCP server. BogDB is the graph layer Claude, GPT, and Llama agents can query directly — no adapter required, no glue code to maintain.
A planner-integrated IExternalIndexProvider contract lets you plug entire access methods into BogDB's optimizer the same way Postgres lets you add custom access methods. Vector indexes, full-text engines, federated lookups, and the commercial SecureSearch provider all participate in planning and execution without forking the core.
BogDB is validated test-by-test against the upstream C++ engine on every supported surface. The receipts:
| Surface | Status |
|---|---|
| Test suite | 1,803 passing · 0 failing · 0 skipped in the Apache 2.0 migration slice |
| Golden query corpora | 52 / 52 green · 714 named golden queries |
| Optimizer rules | 11 of 11 (100%) — TopK, FilterPushDown, LimitPushDown, AccHashJoin, JoinOrder, … |
| Scalar functions | 226 / 226 at C++ parity |
| Window functions | 13 functions · full ROWS / RANGE / GROUPS frame clauses, all five bound types |
| GDS algorithms | 10 algorithms via CALL algo() YIELD * |
| Extension boundary | 10 public extension projects · SecureSearch held back as a commercial provider |
| Storage | Disk-backed columnar · MVCC · WAL with page-level recovery |
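To make the frame-clause row in the table concrete, here is an illustrative query sketch: a running salary total per department with an explicit ROWS frame. The `Employee` schema here is assumed for illustration (it mirrors the window example elsewhere on this page), and the frame semantics shown are the standard SQL-style ones; this is a sketch, not output from the engine.

```cypher
MATCH (e:Employee)
RETURN e.dept, e.name, e.salary,
       SUM(e.salary) OVER (PARTITION BY e.dept ORDER BY e.salary DESC
                           ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW)
         AS running_total
ORDER BY e.dept, running_total
```

Swapping `ROWS` for `RANGE` or `GROUPS` changes how peers with equal `ORDER BY` values are grouped into the frame, which is exactly the distinction the frame-clause support covers.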
Most graph databases that claim "extensibility" mean you can register a function. BogDB lets you register an entire access method. Implementations of IExternalIndexProvider participate in BogDB's planner and execution at the same level as built-in scans — the engine never has to know what index technology is behind them. The result is a clean separation: the engine handles parsing, planning, joins, projections, aggregation, and execution; the extension handles its own index, its own lookup semantics, and its own storage.
`IExternalIndexProvider` — The contract an extension implements to expose a custom index. The provider advertises which predicate shapes it can satisfy, returns a cost estimate for the planner, and executes the physical scan on demand. Register one provider for an external vector store, another for an FTS engine, a third for commercial HMAC-tokenized secure search — they coexist, and the planner picks the best match per query.

`ExternalPredicatePlannerHook` — A planner-time hook that runs during predicate placement. Registered providers inspect each predicate and may claim it, telling the optimizer "I can satisfy this faster than a full scan." Once claimed, the predicate is routed into an `ExternalIndexScan` instead of the default `ScanNodeProperty` plus `PhysicalFilter` pipeline. No core changes are needed to onboard a new index technology.

`ExternalIndexScan` — The physical operator that delegates scan execution to the registered provider. It yields node or relationship IDs that match the claimed predicate, which then flow through the normal pipeline (joins, projections, aggregation, top-K) unchanged. Your extension only writes the part that's unique to your index; everything downstream is BogDB's existing machinery.

`ExtensionLookupScan` — A point-lookup specialization for predicate shapes that resolve to a single match or a small candidate set: secure equality lookups, hash-based dedup, primary-key probes against an external store. The planner picks this operator over `ExternalIndexScan` when an extension declares a predicate as a point lookup, so plans optimize for cardinality-one access instead of range scans.
This is the same architecture pattern Postgres pioneered with custom access methods. Anything that can describe a predicate shape and produce IDs can be a BogDB index — without modifying or forking the core engine.
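The claim-and-route flow described above can be modeled without the engine. The following is a minimal standalone sketch, not BogDB's actual API: the type names echo the components named on this page, but every record, signature, and cost rule here is an assumption made for illustration.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Stand-in shapes for the sketch — assumptions, not BogDB's real types.
public record PredicateShape(bool IsEquality, bool IsRange);
public record CostEstimate(long Rows, double Cost);

public interface IExternalIndexProvider
{
    string Name { get; }
    bool CanSatisfy(PredicateShape shape);
    CostEstimate Estimate(PredicateShape shape);
}

// Toy model of the planner-time hook: each registered provider may claim a
// predicate, and the cheapest claim that beats the default scan cost wins.
public sealed class PlannerHookSketch
{
    private readonly List<IExternalIndexProvider> _providers = new();

    public void Register(IExternalIndexProvider provider) => _providers.Add(provider);

    // Returns the name of the physical operator the planner would emit.
    public string PlaceScan(PredicateShape shape, double fullScanCost)
    {
        var claims = _providers
            .Where(p => p.CanSatisfy(shape))
            .Select(p => (Provider: p, Estimate: p.Estimate(shape)))
            .OrderBy(c => c.Estimate.Cost)
            .ToList();

        if (claims.Count == 0 || claims[0].Estimate.Cost >= fullScanCost)
            return "ScanNodeProperty + PhysicalFilter"; // nobody claimed it: default pipeline

        var best = claims[0];
        // Cardinality-one claims route to the point-lookup specialization.
        return best.Estimate.Rows == 1
            ? $"ExtensionLookupScan({best.Provider.Name})"
            : $"ExternalIndexScan({best.Provider.Name})";
    }
}

// An equality-only provider in the spirit of a secure hash lookup.
public sealed class HashEqualityProvider : IExternalIndexProvider
{
    public string Name => "hash_eq";
    public bool CanSatisfy(PredicateShape shape) => shape.IsEquality;
    public CostEstimate Estimate(PredicateShape shape) => new(Rows: 1, Cost: 1.0);
}
```

Registering `HashEqualityProvider` and feeding an equality predicate through the hook routes it to the point-lookup operator, while a range predicate no provider claims falls back to the default scan-plus-filter pipeline — the same decision structure the real planner hook makes.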
All of it in idiomatic .NET, with no separate query process or out-of-proc service.
```csharp
// Run a built-in GDS algorithm and stream the results.
var r = conn.Query("CALL pagerank() YIELD *");
while (r.HasNext()) {
    var row = r.GetNext();
    Console.WriteLine($"node {row.GetString(0)} → rank {row.GetDouble(1):F6}");
}
```

```csharp
// Window functions: rank employees within each department by salary.
conn.Query(@"
    MATCH (e:Employee)
    RETURN e.dept, e.name, e.salary,
           ROW_NUMBER() OVER (PARTITION BY e.dept ORDER BY e.salary DESC) AS rn
    ORDER BY e.dept, rn");
```

```csharp
// Variable-length path traversal, one to three hops out from Alice.
conn.Query(@"
    MATCH (a:Person)-[r*1..3]->(b:Person)
    WHERE a.name = 'Alice'
    RETURN a.name, length(r), nodes(r), rels(r)");
```

```csharp
public class MySecureIndex : IExternalIndexProvider
{
    public string Name => "secure_lookup";
    public bool CanSatisfy(PredicateShape shape) => shape.IsEquality;
    public CostEstimate Estimate(PredicateShape shape) => new(rows: 1, cost: 1.0);
    // _index is the extension's own backing store; BogDB never sees it.
    public IEnumerable<NodeId> Lookup(PredicateValue value) => _index.Find(value);
}

// Register at startup — planner and executor pick it up automatically
db.RegisterExternalIndexProvider(new MySecureIndex());
```

Each public sample is a working Blazor or console app in the BogDB repository.
A developer tool that converts SQL idioms to Cypher patterns. The fastest path for relational engineers learning graphs.
Benchmarks BogDB's query generation, retrieval, and graph-RAG performance across multiple LLM strategies.
A classic social-network reference: people, follows, posts, and reactions. The "graph database 101" sample.
NuGet- and npm-style dependency-graph analysis: vulnerabilities, transitive impact, vendor concentration.
A consumer-friendly food / ingredient / nutrient knowledge graph. Demonstrates richer modeling and traversal.
A creature / ability / affinity browser in the classic tabletop-RPG style. Hierarchical and tag-based traversal, made fun.
The engine is free under Apache 2.0. We build the businesses on top: vertical accelerators, a secure-search index provider, FedRAMP-aligned hardening, managed and hosted deployments, and the full menu of support and integration services.
Our vertical accelerators are graph-backed application kits in the markets we know best: financial fraud detection, cyber compliance and CMMC posture, supply-chain risk, regulated healthcare, defense logistics, and tactical communications analysis. Each is a working Blazor reference application that integrates with your data sources via the BogDB external-index framework.
BogDB is a three-month, AI-first migration of a real C++ database-engine surface to native .NET — no toy, no tutorial reskin, no single-tool experiment. Jacob Anderson led the engineering with four frontier models in the rotation: Claude (Anthropic), Codex (OpenAI), Gemini (Google), and Grok (xAI).
Each one earned its keep — drafting architecture documents, porting operators, generating tests, reasoning about optimizer rules, reviewing diffs, and pushing back when something didn't smell right. AI was on the team. The 1,803 passing tests, the 11-of-11 optimizer rules, and the parity numbers above were earned the old-fashioned way — one regression at a time — by humans and AI working as collaborators, not as autocomplete.
This is also why we sell what we sell. If you want a team that can actually drive frontier models hard enough to ship a database engine, that's what Beyond Ordinary does for a living. The page above tells you how to reach us.
BogDB is licensed under Apache 2.0 — including the patent grant clause. Use it commercially, fork it, build on it, redistribute it. The core engine work originated as a test-by-test parity port of the Kùzu C++ database (MIT, Copyright 2022–2025 Kùzu Inc.); we preserve that attribution and gratefully acknowledge it. All new C# code (the window-function service, the GDS additions, the MCP servers, the external-index provider framework) and the Blazor sample apps are copyright Beyond Ordinary Software Solutions, contributed back to the world under Apache 2.0.