Database, filesystem, routing, cron, debugger — one engine. Every knob exposed to your agent via MCP. Drop a Kanban card, ship a feature.
DieselEngine is a single-binary backend that bundles PostgreSQL-backed tables, a virtual filesystem, HTTP + WebSocket routing, a GraalVM JavaScript runtime, cron jobs, and a Chrome-DevTools debugger.
Then it does the thing nobody else does: it exposes every one of those capabilities as a Model Context Protocol tool. file_write, ingress_create, database_query, debugger_step_over — all callable by any LLM.
Humans write scripts and design systems. Agents ship features. They share the same filesystem, the same database, the same running process — no handoff, no glue.
Drop a Kanban card. Get a deployed endpoint.
It is not a BaaS. It is not a low-code builder. It is not an IDE plugin. It's the runtime your agent works inside.
60+ tools cover files, trees, SQL, ingress, scripts, debugger, cron, and kanban. Any MCP-capable LLM operates the whole platform — no custom SDKs, no glue code.
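As a sketch of what "no custom SDKs" means in practice: an MCP tool call is plain JSON-RPC. The tool name `database_query` comes from the list above, but the `sql` argument name and the exact framing are assumptions based on the MCP spec, not Diesel's documented schema.

```javascript
// Hypothetical MCP tools/call payload for Diesel's database_query tool.
// MCP frames tool calls as JSON-RPC 2.0; the "sql" argument name is an assumption.
const callDatabaseQuery = (sql, id = 1) => ({
  jsonrpc: "2.0",
  id,
  method: "tools/call",
  params: {
    name: "database_query",
    arguments: { sql },
  },
});

const payload = callDatabaseQuery(
  "SELECT id, title FROM cards WHERE column_name = 'Working'"
);
console.log(JSON.stringify(payload, null, 2));
```

Any client that can emit this shape — which is every MCP-capable LLM — can drive the platform.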
Flip a column into agent mode, give it a system prompt, and every card dropped there becomes a task. An agent picks it up, runs tools, commits, and moves the card to Done. You watch it happen live.
Postgres pool, virtual FS, JS runtime, and the autonomous agent loop share one JVM. No cold starts. No edge-function latency. No vendor dashboard.
One virtual filesystem, visible via REST, MCP, and a built-in WebDAV server. Mount it in Finder, Explorer, or VSCode and edit your scripts like any other folder — agents and humans share the same tree.
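Mounting is one command per platform. The `/dav` path and port below are placeholders — check where your instance actually serves WebDAV.

```shell
# Assumed WebDAV endpoint: http://localhost:8080/dav — adjust to your instance.

# macOS (Finder's "Connect to Server" / Cmd-K works too)
mount_webdav http://localhost:8080/dav /Volumes/diesel

# Linux (requires davfs2)
sudo mount -t davfs http://localhost:8080/dav /mnt/diesel

# Windows
net use Z: http://localhost:8080/dav
```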
ES2023 modules on GraalVM with a Chrome DevTools Protocol adapter. Breakpoints, step-in, step-out, eval — all driven by MCP. Agents can debug the code they just wrote.
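To make "ES2023 on GraalVM" concrete, here is the kind of plain modern JavaScript a Diesel script can be — the function shape is illustrative only, not Diesel's documented script contract:

```javascript
// Illustrative only: modern JS of the kind an agent writes and then debugs.
// Array.prototype.findLast is ES2023 and runs on GraalVM's JS engine.
function latestDoneCard(cards) {
  // findLast scans from the end: newest "Done" card without reversing the array.
  return cards.findLast((c) => c.column === "Done") ?? null;
}

const cards = [
  { title: "wire /api/hello", column: "Done" },
  { title: "add auth", column: "Working" },
  { title: "fix cron job", column: "Done" },
];
console.log(latestDoneCard(cards).title); // → "fix cron job"
```

Because the runtime speaks the Chrome DevTools Protocol, a breakpoint inside `findLast`'s callback is reachable from the same MCP session that wrote the file.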
Your data, your runtime, your agents — on your own hardware. MCP live at /mcp, bearer-token gated, ready for any OpenAI-compatible client. No vendor lock-in, no egress fees, no surprises.
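Connecting a client is ordinary HTTP. The JSON-RPC body follows the MCP spec; the port and token below are placeholders, and the request-builder helper is ours, not part of any SDK:

```javascript
// Sketch: build an authenticated request against Diesel's MCP endpoint.
// The /mcp path and bearer gating come from the text above; port and token are placeholders.
function mcpRequest(method, params = {}, id = 1) {
  return {
    url: "http://localhost:8080/mcp", // assumed port
    options: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: "Bearer YOUR_TOKEN", // placeholder
      },
      body: JSON.stringify({ jsonrpc: "2.0", id, method, params }),
    },
  };
}

// Against a live instance:
//   const { url, options } = mcpRequest("tools/list");
//   const tools = await fetch(url, options).then((r) => r.json());
const req = mcpRequest("tools/list");
console.log(JSON.parse(req.options.body).method); // → "tools/list"
```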
This is what actually happens when you drop a card into a Diesel column wired to an agent. Autoplays once, then it's yours — drag any card into Working.
Nobody else wires the whole backend to the LLM. Not Supabase. Not Retool. Not n8n. Not Replit Agent. Not Firebase.
| Capability | diesel.rocks | Supabase | Retool | n8n | Replit Agent | Firebase |
|---|---|---|---|---|---|---|
| MCP-native admin surface (every capability = an LLM tool) | ✓ | — | — | — | ~ | — |
| Self-hosted, single binary | ✓ | ~ | ~ | ✓ | — | — |
| Built-in autonomous agent loop (server-side, persistent) | ✓ | — | — | — | ~ | — |
| Kanban cards trigger agent runs on drop | ✓ | — | — | — | — | — |
| Real JS runtime with Chrome DevTools debugger | ✓ | — | — | ~ | ✓ | — |
| PostgreSQL + JSONB + raw SQL in same process | ✓ | ✓ | — | — | — | — |
| Built-in WebDAV — mount the filesystem in your editor | ✓ | — | — | — | — | — |
| Native WebSocket rooms with per-connection JS context | ✓ | ~ | — | — | — | ~ |
| Cron + startup scripts in the same runtime | ✓ | ~ | — | ✓ | — | — |
| No cold starts, no vendor lock-in | ✓ | ~ | — | ✓ | — | — |

✓ = built in · ~ = partial or via workarounds · — = not available
Supabase gives you Postgres + edge functions. Diesel gives you Postgres, a filesystem, in-process JS, cron, an agent loop, and — the thing that matters — an LLM-drivable control plane.
n8n draws boxes between nodes. Diesel runs your ES-module JavaScript next to your database, with a real debugger, and lets an agent rewire the whole graph through MCP.
Replit's agent edits files in an IDE. Diesel's agent edits files in a live, running server — changes go on-air instantly. The server is the workspace.
```json
{
  "name": "ingress_create",
  "arguments": {
    "type": "HTTP_GET",
    "route": "/api/hello",
    "script_uuid": "a1b2c3d4-…"
  }
}
```

→ 200 OK, route is live at `http://localhost/backend/api/hello`
Tell us what you want to build. We'll spin one up for you — wired, mounted, and ready for your agents.