Open Source Apache 2.0

Fabr

Fabricate Agent Behavior and Reasoning

An open-source .NET framework for building AI agent systems. Agent lifecycle management, chat persistence, tool resolution, and multi-model orchestration — designed as a first-class .NET library using the patterns .NET developers already know.

.NET 10 Orleans Microsoft.Extensions.AI OpenTelemetry
// Define an agent with Fabr
public class MyAgent : FabrAgentProxy
{
    private AIAgent? _agent;
    private AgentThread? _thread;

    public override async Task OnInitialize()
    {
        // Resolve the chat client registered under the "AzureOpenAI" name
        var client = await GetChatClient("AzureOpenAI");
        _agent = new ChatClientAgent(client)
            .AsBuilder()
            .UseOpenTelemetry()
            .Build(ServiceProvider);
        _thread = _agent.GetNewThread();
    }

    public override async Task<AgentMessage> OnMessage(
        AgentMessage message)
    {
        var response = message.Response();
        // Run the agent against the incoming message on the persistent thread
        var result = await _agent!.RunAsync(
            message.Message, _thread);
        response.Message =
            string.Join("\r\n", result.Messages);
        return response;
    }
}

The Infrastructure Agents Need

Fabr provides the foundational infrastructure that AI agent applications require — so you can focus on what your agent does, not on the plumbing underneath. Every agent type inherits from the FabrAgentProxy base class and gets the full infrastructure for free.

Fabricate

Intentional creation. Define agent types, configure behaviors, wire up tools — all through familiar .NET patterns like dependency injection and configuration.

Behavior

Complex, autonomous workflows that adapt and respond. Multi-agent routing, workflow planning, pipeline orchestration — real agent behaviors, not just API wrappers.

Reasoning

Multi-step problem solving with structured output, tool selection, context management, and automatic conversation summarization to stay within model limits.

Fabr Distributed Agent Architecture (system visualization)

Framework Capabilities

Everything your agent application needs, built into the base class

Agent Lifecycle Management

Initialization, health reporting, message processing, and graceful shutdown. OnInitialize() and OnMessage() — implement two methods and get a fully managed agent.

Chat History Persistence

Thread-based message storage with Orleans grain state, buffered writes, and atomic replacement. Conversations survive restarts and are scoped per agent per thread.

Chat History Compaction

Automatic LLM-powered conversation summarization when approaching context window limits. Preserves key context while keeping token usage under control.

Tool Resolution

Automatic discovery of tools from dependency injection with the IFabrPlugin interface. Register plugins once, and any agent can resolve and use them.
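The registration side of this might look like the following sketch. Only the IFabrPlugin interface name comes from the source; the plugin class, its method, and the use of plain AddSingleton registration are assumptions for illustration, not confirmed Fabr API.

```csharp
using Microsoft.Extensions.DependencyInjection;

// Hypothetical plugin: the real IFabrPlugin interface may declare
// members not shown here.
public class WeatherPlugin : IFabrPlugin
{
    // A tool method the framework could surface to agents
    public string GetForecast(string city) => $"Sunny in {city}";
}

var services = new ServiceCollection();

// Register once; any agent can then resolve IFabrPlugin
// implementations and use their tools.
services.AddSingleton<IFabrPlugin, WeatherPlugin>();
```

The point of the pattern is that tool wiring stays in the composition root: agents never construct plugins themselves, they discover whatever the container provides.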

Multi-Model Support

Configurable model providers — Azure OpenAI, OpenAI, local models — with named configurations. Agents can use different models for different tasks.

Structured Output

JSON schema extraction for deterministic LLM responses. Define a C# class, get validated structured data back from the model — no manual parsing.
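A sketch of how this could look inside an agent, assuming a hypothetical ExtractAsync&lt;T&gt; helper (the method name is an assumption for illustration; the real Fabr call may differ). The shape of the C# type becomes the JSON schema, and the model's reply comes back deserialized and validated.

```csharp
// Hypothetical target shape: its properties define the JSON schema
public record TicketTriage(string Category, int Priority, string Summary);

public override async Task<AgentMessage> OnMessage(AgentMessage message)
{
    var response = message.Response();

    // Ask the model for a TicketTriage instead of free-form text
    // (ExtractAsync<T> is an assumed helper, not confirmed API)
    TicketTriage triage = await ExtractAsync<TicketTriage>(message.Message);

    response.Message =
        $"{triage.Category} (P{triage.Priority}): {triage.Summary}";
    return response;
}
```

Because the response is typed, downstream code can branch on `triage.Priority` directly instead of parsing prose.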

The FabrAgentProxy Pattern

One base class. Full infrastructure. Zero boilerplate.

What You Get for Free

Chat persistence and thread management
Automatic context window compaction
Tool and plugin resolution from DI
Health reporting and diagnostics
OpenTelemetry tracing
Orleans grain state management
Structured output extraction

What You Implement

1. OnInitialize(): set up your chat client and tools
2. OnMessage(): decide what to do with each message

Whether it's a simple assistant, a multi-step workflow planner, or an intelligent message router — your agent type just decides what to do with the message. The infrastructure handles the rest.

Built on .NET, For .NET

Not a Python project with .NET bindings. A native .NET framework that uses the patterns you already know.

Dependency Injection

Agents, plugins, and services all register through Microsoft.Extensions.DependencyInjection. Standard .NET service registration — no custom containers or magic.

Configuration

JSON-based agent configuration through Microsoft.Extensions.Configuration. Define agents, their models, plugins, and behaviors in fabr.json.
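A sketch of what such a file might contain. The fabr.json filename and the named "AzureOpenAI" configuration come from the source; the section names, keys, and values below are assumptions for illustration, not the documented schema.

```json
{
  "Fabr": {
    "Models": {
      "AzureOpenAI": {
        "Endpoint": "https://example.openai.azure.com",
        "Deployment": "gpt-4o"
      }
    },
    "Agents": {
      "MyAgent": {
        "Model": "AzureOpenAI",
        "Plugins": [ "WeatherPlugin" ]
      }
    }
  }
}
```

The named model entry is what a call like GetChatClient("AzureOpenAI") would resolve, so switching providers is a configuration change rather than a code change.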

Orleans

Distributed state management through Orleans grains. Chat history, agent state, and thread management — all persisted with Orleans' built-in reliability.

OpenTelemetry

Built-in tracing and observability. Every agent message, tool call, and LLM interaction is traced — plug into your existing observability stack.

Open Source — MIT License

OpenCaddis

The reference implementation that proves what Fabr can do

A framework without an application is just documentation. OpenCaddis is a full AI agent workspace built on Fabr — a Blazor Server application where you create, configure, and interact with multiple AI agents through a chat interface.

It runs entirely on your machine. Conversations, agent configurations, vector memories — all stored locally in SQLite. The only external calls are to your configured LLM provider.

4 Agent Types: Assistant, Delegate, Workflow, Event Log
10 Plugins: WebBrowser, PowerShell, FileSystem, Memory, and more
Vector Memory: semantic search with local embeddings
CaddisFly: pipeline orchestration engine

Get Started

Clone, build, and run in minutes

Fabr Framework

The engine for managing and orchestrating AI agents.

License: Apache 2.0
Stack: .NET 10, Orleans
AI SDK: Microsoft.Extensions.AI
GitHub

OpenCaddis

Full AI agent workspace built on Fabr.

License: MIT
Stack: .NET 10, Blazor, SQLite
Docker: vulcan365/opencaddis
GitHub
# Try OpenCaddis with Docker
docker run -d -p 5000:5000 --name opencaddis vulcan365/opencaddis:latest

# Or clone and build
git clone https://github.com/vulcan365/OpenCaddis.git
cd OpenCaddis/src/OpenCaddis
dotnet run

Need Help Building on Fabr?

Vulcan365 offers consulting and development services for teams building AI agent systems with Fabr and .NET.