FabrCore
Fabricate Agent Behavior and Reasoning
A .NET framework for building distributed AI agent systems on Orleans and Microsoft Agent Framework. Durable, scalable agents with built-in LLM providers, tool execution, MCP integration, and real-time monitoring.
// Define an agent with FabrCore
[AgentAlias("my-assistant")]
public class MyAssistant : FabrCoreAgentProxy
{
    public override async Task<AgentMessage> OnMessage(AgentMessage message)
    {
        var (agent, session) = await Host.CreateChatClientAgent(
            modelName: "AzureProd",
            instructions: "You are a helper.");

        var response = await agent.SendAsync(session, message.Text);
        return message.ToReply(response);
    }
}
The Infrastructure Agents Need
FabrCore provides the foundational infrastructure that AI agent applications require — so you can focus on what your agent does, not on the plumbing underneath. Every agent type inherits from FabrCoreAgentProxy and gets the full infrastructure for free.
Fabricate
Intentional creation. Define agent types, configure behaviors, wire up tools and MCP servers — all through familiar .NET patterns like dependency injection and configuration.
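As a sketch of what that wiring might look like in a host's startup code. The `AddFabrCoreHost` and `AddAgent` method names below are illustrative assumptions, not confirmed API; only `FabrCoreAgentProxy`, `[AgentAlias]`, and the use of Microsoft.Extensions.DependencyInjection are from the text.

```csharp
// Hypothetical host setup -- AddFabrCoreHost/AddAgent are assumed names
// used to illustrate standard .NET service registration.
var builder = WebApplication.CreateBuilder(args);

builder.Services
    .AddFabrCoreHost(builder.Configuration)   // assumed framework entry point
    .AddAgent<MyAssistant>();                 // registers the [AgentAlias("my-assistant")] type

var app = builder.Build();
app.Run();
```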
Behavior
Complex, autonomous workflows that adapt and respond. Inter-agent messaging, fan-out/gather patterns, pipeline orchestration — real agent behaviors, not just API wrappers.
Reasoning
Multi-step problem solving with structured output, tool selection, context management, and automatic conversation summarization to stay within model limits.
4-Package Architecture
Modular design — use only what you need
FabrCore.Core
Core interfaces, data models, and grain abstractions. The shared contracts everything else builds on.
FabrCore.Host
Orleans silo, REST API, chat completions endpoint, WebSocket middleware, and system agents.
FabrCore.Client
ClientContext, ChatDock Blazor component, health monitoring, and agent messaging.
Framework Capabilities
Everything your agent application needs, built into the framework
Agent Lifecycle Management
Initialization, health reporting, message processing, and graceful shutdown. Orleans grains with durable state, timers, and reminders.
Chat History Persistence
Thread-based message storage with Orleans grain state, buffered writes, and atomic replacement. Conversations survive restarts and are scoped per agent per thread.
Chat History Compaction
Automatic LLM-powered conversation summarization when approaching context window limits. Preserves key context while keeping token usage under control.
Plugins and Tools
Stateful plugins via IFabrCorePlugin and stateless standalone tools with [ToolAlias]. Registry-based discovery with collision detection and full DI support.
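A sketch of a stateless standalone tool. `[ToolAlias]` is named by the framework; the method shape and the `[Description]` attribute (from System.ComponentModel, commonly used for tool metadata with Microsoft.Extensions.AI) are assumptions.

```csharp
// Hypothetical standalone tool -- only [ToolAlias] is confirmed by the text.
[ToolAlias("get-weather")]
public static class WeatherTool
{
    [Description("Returns the current weather for a city.")]
    public static Task<string> GetWeatherAsync(string city)
    {
        // A real tool would call a weather API here.
        return Task.FromResult($"Sunny in {city}");
    }
}
```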
Multi-LLM Support
Azure OpenAI, OpenAI, Anthropic, and custom providers via Microsoft.Extensions.AI. Named configurations let agents use different models for different tasks.
Structured Output
JSON schema extraction for deterministic LLM responses. Define a C# class, get validated structured data back from the model — no manual parsing.
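For example, the "define a C# class, get validated data back" flow might look like this. The `GetStructuredResponseAsync` call is an assumed name for the framework's schema-based extraction; only the concept comes from the text.

```csharp
// Shape of the data you want back from the model.
public class TicketTriage
{
    public string Category { get; set; } = "";
    public int Severity { get; set; }
    public string Summary { get; set; } = "";
}

// Inside an agent's OnMessage -- method name is a hypothetical placeholder:
// var triage = await agent.GetStructuredResponseAsync<TicketTriage>(
//     session, message.Text);
```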
MCP Integration
Model Context Protocol server support via Stdio and HTTP transports. Connect agents to external tool ecosystems with config-driven or code-driven setup.
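A config-driven setup might look like the fragment below. The key names (`McpServers`, `Transport`, and so on) are illustrative assumptions; only the Stdio/HTTP transports and `fabrcore.json` itself are from the text.

```json
{
  "McpServers": {
    "filesystem": {
      "Transport": "Stdio",
      "Command": "npx",
      "Arguments": ["-y", "@modelcontextprotocol/server-filesystem", "/data"]
    },
    "search": {
      "Transport": "Http",
      "Url": "https://mcp.example.com/"
    }
  }
}
```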
Real-Time Monitoring
Agent message traffic observation, event stream monitoring, LLM request/response capture, and token usage tracking with IAgentMessageMonitor.
ChatDock UI
Floating chat panel component for Blazor Server with configurable positions, scoped instances, and multi-ChatDock support via ChatDockManager.
Inter-Agent Messaging
Fan-out/gather, pipeline, and supervisor patterns with handle-based routing. ACL-based access control for agent-to-agent communication.
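The fan-out/gather pattern inside an agent could be sketched like this. `Host.SendToAgentAsync` and alias-based addressing are assumed names; the pattern itself (fan out, await all, combine) is what the framework describes.

```csharp
// Hypothetical fan-out/gather -- SendToAgentAsync is an assumed API name.
public override async Task<AgentMessage> OnMessage(AgentMessage message)
{
    var workers = new[] { "summarizer", "classifier", "translator" };

    // Fan out the same input to several agents in parallel.
    var tasks = workers.Select(alias =>
        Host.SendToAgentAsync(alias, message.Text));

    // Gather all replies and combine them into one response.
    var results = await Task.WhenAll(tasks);
    return message.ToReply(string.Join("\n---\n", results));
}
```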
WebSocket & REST API
Full REST API with chat completions endpoint and WebSocket middleware for real-time bidirectional communication. Agent management, messaging, and health endpoints.
Testing Harness
In-memory agent testing with TestFabrCoreAgentHost. Mock LLM mode with FakeChatClient or live LLM integration tests. MSTest patterns included.
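An MSTest sketch of what a mock-LLM test might look like. `TestFabrCoreAgentHost` and `FakeChatClient` are named by the framework, but the methods used on them here (`WithFakeChatClient`, `CreateAgentAsync`, `SendAsync`) are assumptions.

```csharp
// Hypothetical in-memory test -- method names are illustrative only.
[TestMethod]
public async Task Assistant_Replies_To_Greeting()
{
    await using var host = new TestFabrCoreAgentHost()
        .WithFakeChatClient("Hello! How can I help?");   // canned LLM reply

    var agent = await host.CreateAgentAsync<MyAssistant>("my-assistant");
    var reply = await agent.SendAsync("Hi there");

    Assert.IsFalse(string.IsNullOrEmpty(reply.Text));
}
```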
The FabrCoreAgentProxy Pattern
One base class. Full infrastructure. Zero boilerplate.
What You Get for Free
Lifecycle management, chat history persistence and compaction, tool execution, inter-agent messaging, monitoring, and the REST/WebSocket API — every capability listed above, inherited from the base class.
What You Implement
OnMessage() — handle each incoming message
OnInitialize() — optional setup logic
OnEvent() — optional event handling
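Put together, an agent type can be as small as the three override points. `OnMessage`'s signature comes from the example at the top; the `OnInitialize` and `OnEvent` signatures below are assumptions based on the names alone.

```csharp
// Minimal agent sketch -- OnInitialize/OnEvent shapes are hypothetical.
[AgentAlias("router")]
public class Router : FabrCoreAgentProxy
{
    public override Task OnInitialize() => Task.CompletedTask;          // optional setup

    public override Task OnEvent(AgentEvent evt) => Task.CompletedTask; // optional events

    public override async Task<AgentMessage> OnMessage(AgentMessage message)
    {
        // Decide what to do with the message; the infrastructure handles the rest.
        return message.ToReply("routed");
    }
}
```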
Whether it's a simple assistant, a multi-step workflow planner, or an intelligent message router — your agent type just decides what to do with the message. The infrastructure handles the rest.
Built on .NET, For .NET
Not a Python project with .NET bindings. A native .NET framework that uses the patterns you already know.
Dependency Injection
Agents, plugins, and services all register through Microsoft.Extensions.DependencyInjection. Standard .NET service registration — no custom containers or magic.
Configuration
JSON-based agent and LLM provider configuration through Microsoft.Extensions.Configuration. Define models, API keys, and behaviors in fabrcore.json.
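A `fabrcore.json` fragment for named model configurations might look like this. The key names are illustrative assumptions; "AzureProd" matches the named configuration used in the example at the top, and the provider list comes from the Multi-LLM Support section.

```json
{
  "FabrCore": {
    "Models": {
      "AzureProd": {
        "Provider": "AzureOpenAI",
        "Deployment": "gpt-4o",
        "Endpoint": "https://my-resource.openai.azure.com/",
        "ApiKey": "<from user-secrets or environment>"
      },
      "ClaudeFast": {
        "Provider": "Anthropic",
        "Model": "claude-haiku"
      }
    }
  }
}
```

Keeping secrets out of the JSON file (user-secrets in development, environment variables or a vault in production) follows standard Microsoft.Extensions.Configuration practice.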
Orleans 10
Distributed state management through Orleans grains. Clustering with Azure Storage or SQL Server, persistence, streaming, reminders, and multi-silo deployment.
Microsoft Agent Framework
Built on Microsoft.Agents.AI 1.0 with ChatClientAgent, agent sessions, and thread patterns for standardized AI agent interactions.
Microsoft.Extensions.AI
Unified AI abstractions for multi-provider LLM support. Switch between Azure OpenAI, OpenAI, Anthropic, and custom providers without changing agent code.
OpenTelemetry
Built-in tracing and observability. Every agent message, tool call, and LLM interaction is traced — plug into your existing observability stack.
OpenCaddis
The reference implementation that proves what FabrCore can do
A framework without an application is just documentation. OpenCaddis is a full AI agent workspace built on FabrCore — a Blazor Server application where you create, configure, and interact with multiple AI agents through a chat interface.
It runs entirely on your machine. Conversations, agent configurations, vector memories — all stored locally in SQLite. The only external calls are to your configured LLM provider.
Get Started
Install, build, and run in minutes
OpenCaddis
Full AI agent workspace built on FabrCore.
| License | MIT |
| Stack | .NET 10, Blazor, SQLite |
| Docker | vulcan365/opencaddis |
# Install FabrCore.Host (pulls in Sdk and Core)
dotnet add package FabrCore.Host
# For client applications
dotnet add package FabrCore.Client
# Or try OpenCaddis with Docker
docker run -d -p 5000:5000 --name opencaddis vulcan365/opencaddis:latest
Need Help Building on FabrCore?
Vulcan365 offers consulting and development services for teams building AI agent systems with FabrCore and .NET.
Birmingham, Alabama