Open Source Apache 2.0

FabrCore

Fabricate Agent Behavior and Reasoning

A .NET framework for building distributed AI agent systems on Orleans and Microsoft Agent Framework. Durable, scalable agents with built-in LLM providers, tool execution, MCP integration, and real-time monitoring.

.NET 10 · Orleans 10 · Microsoft.Agents.AI · Microsoft.Extensions.AI · OpenTelemetry
// Define an agent with FabrCore
[AgentAlias("my-assistant")]
public class MyAssistant : FabrCoreAgentProxy
{
    public override async Task<AgentMessage>
        OnMessage(AgentMessage message)
    {
        var (agent, session) =
            await Host.CreateChatClientAgent(
                modelName: "AzureProd",
                instructions: "You are a helper."
            );

        var response = await agent
            .SendAsync(session, message.Text);
        return message.ToReply(response);
    }
}

The Infrastructure Agents Need

FabrCore provides the foundational infrastructure that AI agent applications require — so you can focus on what your agent does, not on the plumbing underneath. Every agent type inherits from FabrCoreAgentProxy and gets the full infrastructure for free.

Fabricate

Intentional creation. Define agent types, configure behaviors, wire up tools and MCP servers — all through familiar .NET patterns like dependency injection and configuration.

Behavior

Complex, autonomous workflows that adapt and respond. Inter-agent messaging, fan-out/gather patterns, pipeline orchestration — real agent behaviors, not just API wrappers.

Reasoning

Multi-step problem solving with structured output, tool selection, context management, and automatic conversation summarization to stay within model limits.

FabrCore Distributed Agent Architecture

4-Package Architecture

Modular design — use only what you need

FabrCore.Core

Core interfaces, data models, and grain abstractions. The shared contracts everything else builds on.

NuGet

FabrCore.Sdk

FabrCoreAgentProxy, plugins, tools, MCP integration, and agent monitoring.

NuGet

FabrCore.Host

Orleans silo, REST API, chat completions endpoint, WebSocket middleware, and system agents.

NuGet

FabrCore.Client

ClientContext, ChatDock Blazor component, health monitoring, and agent messaging.

NuGet

Framework Capabilities

Everything your agent application needs, built into the framework

Agent Lifecycle Management

Initialization, health reporting, message processing, and graceful shutdown. Orleans grains with durable state, timers, and reminders.

Chat History Persistence

Thread-based message storage with Orleans grain state, buffered writes, and atomic replacement. Conversations survive restarts and are scoped per agent per thread.

Chat History Compaction

Automatic LLM-powered conversation summarization when approaching context window limits. Preserves key context while keeping token usage under control.

Plugins and Tools

Stateful plugins via IFabrCorePlugin and stateless standalone tools with [ToolAlias]. Registry-based discovery with collision detection and full DI support.
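As a sketch of the stateless-tool pattern described above — the [ToolAlias] attribute name comes from the text; the method shape and tool name are assumptions for illustration:

```csharp
// Hypothetical stateless tool. [ToolAlias] marks it for
// registry-based discovery; the class itself holds no state.
[ToolAlias("get-time")]
public class GetTimeTool
{
    // Invoked by the framework when the LLM selects this tool.
    // The signature is illustrative, not the actual SDK contract.
    public Task<string> InvokeAsync(string timeZoneId)
    {
        var tz = TimeZoneInfo.FindSystemTimeZoneById(timeZoneId);
        var now = TimeZoneInfo.ConvertTimeFromUtc(DateTime.UtcNow, tz);
        return Task.FromResult(now.ToString("O"));
    }
}
```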

Multi-LLM Support

Azure OpenAI, OpenAI, Anthropic, and custom providers via Microsoft.Extensions.AI. Named configurations let agents use different models for different tasks.

Structured Output

JSON schema extraction for deterministic LLM responses. Define a C# class, get validated structured data back from the model — no manual parsing.
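A sketch of what the structured-output flow might look like in an agent. The record definition is plain C#; the extraction call name is a hypothetical placeholder, since the source names the capability but not the API:

```csharp
// Define a C# class; the framework derives a JSON schema from it
// and constrains the model's response to match.
public record TicketTriage(string Category, int Priority, string Summary);

// Inside an agent's OnMessage (method name is illustrative):
// var triage = await agent.GetStructuredAsync<TicketTriage>(
//     session, "Classify this support ticket: " + message.Text);
// triage.Category and triage.Priority come back strongly typed,
// with no manual JSON parsing.
```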

MCP Integration

Model Context Protocol server support via Stdio and HTTP transports. Connect agents to external tool ecosystems with config-driven or code-driven setup.

Real-Time Monitoring

Agent message traffic observation, event stream monitoring, LLM request/response capture, and token usage tracking with IAgentMessageMonitor.

ChatDock UI

Floating chat panel component for Blazor Server with configurable positions, scoped instances, and multi-ChatDock support via ChatDockManager.

Inter-Agent Messaging

Fan-out/gather, pipeline, and supervisor patterns with handle-based routing. ACL-based access control for agent-to-agent communication.
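The fan-out/gather pattern named above can be sketched roughly as follows. The pattern and the handle-based routing come from the text; the send method name is an assumption:

```csharp
// Hypothetical fan-out/gather: send the same request to several
// agents by handle, then await and combine all replies.
// Host.SendToAgentAsync is an illustrative name, not the real API.
var handles = new[] { "research-agent", "summary-agent" };
var tasks = handles.Select(h => Host.SendToAgentAsync(h, message));
var replies = await Task.WhenAll(tasks);
```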

WebSocket & REST API

Full REST API with chat completions endpoint and WebSocket middleware for real-time bidirectional communication. Agent management, messaging, and health endpoints.

Testing Harness

In-memory agent testing with TestFabrCoreAgentHost. Mock LLM mode with FakeChatClient or live LLM integration tests. MSTest patterns included.
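A sketch of an in-memory MSTest using the two types named above (TestFabrCoreAgentHost and FakeChatClient); every member beyond those type names is an assumption about the harness shape:

```csharp
// Illustrative test: run an agent against a mocked LLM so the
// test is deterministic and needs no network access.
[TestMethod]
public async Task Assistant_replies_to_greeting()
{
    using var host = new TestFabrCoreAgentHost()
        .WithFakeChatClient(reply: "Hello!");   // mock LLM mode

    var response = await host.SendAsync("my-assistant", "Hi");
    Assert.IsTrue(response.Text.Contains("Hello"));
}
```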

The FabrCoreAgentProxy Pattern

One base class. Full infrastructure. Zero boilerplate.

What You Get for Free

Chat persistence and thread management
Automatic context window compaction
Plugin, tool, and MCP resolution from DI
Health reporting and diagnostics
OpenTelemetry tracing
Orleans grain state management
Real-time message and LLM monitoring
Inter-agent messaging with ACL
Structured output extraction

What You Implement

1. OnMessage() — handle each incoming message
2. OnInitialize() — optional setup logic
3. OnEvent() — optional event handling

Whether it's a simple assistant, a multi-step workflow planner, or an intelligent message router — your agent type just decides what to do with the message. The infrastructure handles the rest.
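The three overridable members listed above, sketched on a minimal agent. OnMessage mirrors the sample at the top of the page; the OnInitialize and OnEvent signatures are assumptions:

```csharp
// Minimal agent overriding all three extension points.
[AgentAlias("router")]
public class RouterAgent : FabrCoreAgentProxy
{
    // Optional setup logic (signature assumed).
    public override Task OnInitialize() => Task.CompletedTask;

    // Required: decide what to do with each incoming message.
    public override Task<AgentMessage> OnMessage(AgentMessage message)
        => Task.FromResult(message.ToReply("routed"));

    // Optional event handling (signature assumed).
    public override Task OnEvent(object evt) => Task.CompletedTask;
}
```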

Built on .NET, For .NET

Not a Python project with .NET bindings. A native .NET framework that uses the patterns you already know.

Dependency Injection

Agents, plugins, and services all register through Microsoft.Extensions.DependencyInjection. Standard .NET service registration — no custom containers or magic.
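A registration sketch under the standard Microsoft.Extensions.DependencyInjection pattern described above. AddFabrCore, AddAgent, and MyMemoryPlugin are illustrative names; the point is that no custom container is involved:

```csharp
// Standard .NET host setup; extension-method names are hypothetical.
var builder = WebApplication.CreateBuilder(args);
builder.Services
    .AddFabrCore(builder.Configuration)            // framework services
    .AddAgent<MyAssistant>()                       // register an agent type
    .AddSingleton<IFabrCorePlugin, MyMemoryPlugin>(); // a stateful plugin
```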

Configuration

JSON-based agent and LLM provider configuration through Microsoft.Extensions.Configuration. Define models, API keys, and behaviors in fabrcore.json.
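A sketch of what a named-model entry in fabrcore.json might look like. Only the file name and the "AzureProd" configuration name appear in this document; the key structure and values below are illustrative assumptions:

```json
{
  "FabrCore": {
    "Models": {
      "AzureProd": {
        "Provider": "AzureOpenAI",
        "Deployment": "gpt-4o",
        "Endpoint": "https://example.openai.azure.com",
        "ApiKey": "<from-user-secrets>"
      },
      "FastAndCheap": {
        "Provider": "OpenAI",
        "Model": "gpt-4o-mini"
      }
    }
  }
}
```

Named configurations like these are what let one agent reference "AzureProd" for reasoning-heavy work and a cheaper model for routine tasks without code changes.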

Orleans 10

Distributed state management through Orleans grains. Clustering with Azure Storage or SQL Server, persistence, streaming, reminders, and multi-silo deployment.

Microsoft Agent Framework

Built on Microsoft.Agents.AI 1.0 with ChatClientAgent, agent sessions, and thread patterns for standardized AI agent interactions.

Microsoft.Extensions.AI

Unified AI abstractions for multi-provider LLM support. Switch between Azure OpenAI, OpenAI, Anthropic, and custom providers without changing agent code.

OpenTelemetry

Built-in tracing and observability. Every agent message, tool call, and LLM interaction is traced — plug into your existing observability stack.

Open Source — MIT License

OpenCaddis

The reference implementation that proves what FabrCore can do

A framework without an application is just documentation. OpenCaddis is a full AI agent workspace built on FabrCore — a Blazor Server application where you create, configure, and interact with multiple AI agents through a chat interface.

It runs entirely on your machine. Conversations, agent configurations, vector memories — all stored locally in SQLite. The only external calls are to your configured LLM provider.

4 Agent Types
Assistant, Delegate, Workflow, Event Log
10 Plugins
WebBrowser, PowerShell, FileSystem, Memory, and more
Vector Memory
Semantic search with local embeddings
CaddisFly
Pipeline orchestration engine

Get Started

Install, build, and run in minutes

FabrCore Framework

The engine for building and orchestrating distributed AI agents.

License: Apache 2.0
Stack: .NET 10, Orleans 10
AI SDK: Microsoft.Agents.AI

OpenCaddis

Full AI agent workspace built on FabrCore.

License: MIT
Stack: .NET 10, Blazor, SQLite
Docker: vulcan365/opencaddis
GitHub
# Install FabrCore.Host (pulls in Sdk and Core)
dotnet add package FabrCore.Host

# For client applications
dotnet add package FabrCore.Client

# Or try OpenCaddis with Docker
docker run -d -p 5000:5000 --name opencaddis vulcan365/opencaddis:latest

Need Help Building on FabrCore?

Vulcan365 offers consulting and development services for teams building AI agent systems with FabrCore and .NET.