Private Alpha

Code that
thinks together.

GraphBus is a multi-agent orchestration protocol where LLM-powered agents negotiate, refactor, and evolve your codebase — then run it statically at production speed.

hello_graphbus/agents/hello_service.py
from graphbus_core import GraphBusNode, schema_method, subscribe

class HelloService(GraphBusNode):
    SYSTEM_PROMPT = "I generate friendly greeting messages."

    @schema_method(
        input_schema={},
        output_schema={"message": str}
    )
    def generate_message(self):
        return {"message": "Hello from GraphBus!"}

    @subscribe("/Hello/MessageGenerated")
    def on_message(self, event):
        # Agents react to events on the bus
        self.log(event.payload)

2 Execution modes · 16 CLI commands · $0 AI cost at runtime · 100% Test coverage (core)

Two modes. One codebase.

GraphBus separates the intelligence of building from the efficiency of running.

🔨
BUILD MODE

Agents Active

Your Python classes become LLM-powered agents. They analyze their own source code, negotiate with each other via the message bus, and collaboratively refactor the entire codebase — reaching consensus before committing any changes.

  • LLM agent per class
  • Proposal / evaluate / commit cycle
  • DAG-based orchestration via networkx
  • Code is mutable — agents can rewrite it
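
In plain Python, the cycle above boils down to something like this. A minimal, framework-free sketch; Proposal, agent_approves, and negotiation_round are illustrative names, not the GraphBus API.

from dataclasses import dataclass

@dataclass
class Proposal:
    author: str            # agent that suggested the change
    diff: str              # proposed source rewrite
    approvals: int = 0     # votes gathered from peer agents

def agent_approves(agent: str, proposal: Proposal) -> bool:
    # Stand-in for the LLM evaluation an agent performs in build mode.
    return agent != proposal.author

def negotiation_round(agents: list[str], proposals: list[Proposal]) -> list[Proposal]:
    # Every agent evaluates every peer proposal and casts a vote.
    for proposal in proposals:
        proposal.approvals = sum(
            agent_approves(a, proposal) for a in agents if a != proposal.author
        )
    # Only proposals that reach majority consensus are committed to source.
    quorum = len(agents) // 2 + 1
    return [p for p in proposals if p.approvals >= quorum]
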
Artifacts
RUNTIME MODE

Agents Dormant

The built artifacts execute as pure, static Python. No LLM calls, no overhead — just fast deterministic execution. The message bus handles pub/sub routing; everything else is zero-cost.

  • No LLM cost at runtime
  • Deterministic, auditable code
  • Pub/sub message routing
  • Code is immutable — frozen for production
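
On the hot path, runtime mode is little more than dictionary lookups and direct method calls. A rough sketch, assuming a single JSON artifact inside the .graphbus directory (the file name and layout here are illustrative):

import json

def load_artifacts(path: str = ".graphbus/graph.json") -> dict:
    # Load the frozen build output: nodes, edges, topics, schemas.
    with open(path) as f:
        return json.load(f)

def dispatch(topic: str, payload: dict, subscriptions: dict, handlers: dict) -> None:
    # Pure pub/sub routing: look up subscribers, call plain Python methods.
    # No LLM is consulted anywhere on this path.
    for handler_name in subscriptions.get(topic, []):
        handlers[handler_name](payload)
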
01

Define your agents

Subclass GraphBusNode. Write your business logic. Add a system prompt. That's it — each class is now a first-class agent on the bus.
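
A second agent can be wired to the HelloService example above with nothing but the same constructs. The file name, class name, and prompt below are made up for illustration:

hello_graphbus/agents/greeting_logger.py (hypothetical)
from graphbus_core import GraphBusNode, subscribe

class GreetingLogger(GraphBusNode):
    SYSTEM_PROMPT = "I record every greeting the system produces."

    @subscribe("/Hello/MessageGenerated")
    def on_greeting(self, event):
        # Subscribing to another agent's topic adds an edge to the graph.
        self.log(event.payload)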

02

Run the build

GraphBus scans your modules, builds a dependency graph with networkx, activates one LLM agent per class, and orchestrates a negotiation round. Agents propose improvements, vote, and commit changes to source.
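
Under the hood the dependency graph is ordinary networkx. A toy equivalent of what the build would derive from the two example agents:

import networkx as nx

# An edge A -> B means B consumes an event that A publishes.
graph = nx.DiGraph()
graph.add_edge("HelloService", "GreetingLogger")  # via /Hello/MessageGenerated

assert nx.is_directed_acyclic_graph(graph)
print(list(nx.topological_sort(graph)))  # ['HelloService', 'GreetingLogger']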

03

Deploy the artifacts

The build emits clean JSON artifacts — graph, agents, topics, schemas. Runtime loads these and executes your (now-improved) code with zero AI overhead.
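
The exact artifact format isn't documented here, but conceptually each build emits something along these lines; the field names are illustrative and grounded only in the four categories above:

artifact = {
    "graph":   {"nodes": ["HelloService", "GreetingLogger"],
                "edges": [["HelloService", "GreetingLogger"]]},
    "agents":  {"HelloService": "agents/hello_service.py"},
    "topics":  {"/Hello/MessageGenerated": ["GreetingLogger.on_greeting"]},
    "schemas": {"HelloService.generate_message": {"input": {}, "output": {"message": "str"}}},
}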

04

Evolve over time

Your codebase improves with every build cycle. Agents negotiate schema contracts, refactor for coherence, and adapt to changing requirements — collaboratively.

Everything you need to orchestrate at scale.

🕸️

Graph-native

Every agent is a node. Every dependency is an edge. The entire system is a live, traversable networkx DAG — built for topological reasoning.

📨

Message Bus

Typed pub/sub messaging across agents. Topics, subscriptions, and event routing baked into the protocol — no external broker required.

🤝

Agent Negotiation

Agents propose code changes, evaluate each other's proposals, and vote. An arbiter resolves conflicts. Consensus drives commits.

🔒

Schema Contracts

Method-level input/output schemas define the contract between agents. The build validates every edge in the graph before emitting artifacts.
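
Conceptually, validating an edge means checking that whatever the producer emits satisfies what its subscribers expect. A hand-rolled sketch, not the GraphBus validator:

def edge_is_valid(producer_output: dict, consumer_input: dict) -> bool:
    # Every field the consumer expects must be produced upstream
    # with a matching type.
    return all(producer_output.get(k) is t for k, t in consumer_input.items())

# HelloService above declares {"message": str} as its output:
assert edge_is_valid({"message": str}, {"message": str})
assert not edge_is_valid({"message": str}, {"user_id": int})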

🖥️

CLI + TUI

16 production-grade CLI commands. An interactive TUI for build, runtime, deploy, and profiling — keyboard-driven, no mouse required.

🚀

Deploy anywhere

Built-in Docker, Kubernetes, and CI/CD tooling. Generate Dockerfiles, K8s manifests, GitHub Actions — from the CLI in seconds.

📊

Observability

Prometheus metrics, real-time dashboards, event timelines, and health monitoring. Know exactly what your agents are doing.

🧪

Test-first

100% passing tests on Runtime Core, CLI, and deployment tooling. pytest-native. Coverage reporting built in. Ship with confidence.
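
A test against the HelloService example above could be as small as this; it assumes the node can be instantiated directly, outside a running bus, and that the import path mirrors the example project layout:

# tests/test_hello_service.py
from hello_graphbus.agents.hello_service import HelloService

def test_generate_message_matches_schema():
    result = HelloService().generate_message()
    assert result == {"message": "Hello from GraphBus!"}
    assert isinstance(result["message"], str)  # honours the declared output_schema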

GraphBus vs. the alternatives.

Other frameworks run agents. GraphBus makes agents collaboratively write and evolve the code itself.

Capability | GraphBus | LangGraph | CrewAI | AutoGen
Agents rewrite source code | ✓ Yes | ✗ No | ✗ No | ⚬ Limited
Zero LLM cost at runtime | ✓ Always | ✗ Every call | ✗ Every call | ✗ Every call
Agent negotiation / consensus | ✓ Built-in | ✗ No | ⚬ Partial | ⚬ Partial
Graph-native DAG orchestration | ✓ networkx | ✓ Yes | ✗ No | ✗ No
Typed schema contracts per edge | ✓ Yes | ⚬ Partial | ✗ No | ✗ No
Build / Runtime mode separation | ✓ Core design | ✗ No | ✗ No | ✗ No
Pure Python, no vendor lock-in | ✓ Yes | ✓ Yes | ✓ Yes | ✓ Yes
Built-in K8s / Docker deploy | ✓ CLI native | ✗ No | ✗ No | ✗ No

Not a framework. A protocol.

GraphBus defines how intelligent agents communicate, negotiate, and evolve — a lingua franca for agent-driven code orchestration.

[Diagram: ServiceA, ServiceB, and ServiceC exchange proposal, evaluation, commit, and event messages over the GraphBus; an Arbiter resolves conflicts.]

Up in 60 seconds.

1
Install
pip install graphbus-core
2
Init a project
graphbus init my-project --template microservices
cd my-project
3
Build + run
graphbus build agents/
graphbus run .graphbus
4
Enable LLM agents (optional)
export ANTHROPIC_API_KEY=sk-...
graphbus build agents/ --enable-agents

Join the waitlist

GraphBus is in alpha. We're onboarding early adopters who want to shape the protocol. Drop your email — we'll reach out when we're ready for you.

No spam. Just launch news and protocol updates.

Already have questions? Email us directly →