# Agentic
A lightweight .NET library for building LLM-powered agents.
Agentic is a lightweight .NET library for building LLM-powered agents with streaming chat, MCP tool hosting, context compaction, and vector storage — all via a clean, attribute-driven API.
## Features
| Feature | Description |
|---|---|
| ILLMBackend | Unified abstraction over any inference source; swap backends without touching agent code |
| OpenAIBackend | OpenAI-compatible REST client with streaming, embeddings, vision, and health checks |
| NativeBackend | Local llama.cpp inference with auto-install from GitHub releases |
| LlamaRuntimeInstaller | On-demand runtime installer for CPU, CUDA, and Vulkan on Windows and Linux |
| Agent | Multi-turn streaming agent with automatic MCP tool orchestration |
| Image Input | Send images alongside text as URL, local file, or base64 |
| Workflows | Ordered multi-step execution with per-step async guardrails and retry |
| Tool System | Define tools with [Tool] / [ToolParam] attributes; zero boilerplate |
| Tool Context | HTTP headers forwarded to tool methods via ToolContext |
| MCP Server | Expose any IAgentToolSet over HTTP as a Model Context Protocol server |
| Context Compaction | Auto-summarise older history into a structured checkpoint |
| Vector Storage | IStore / ICollection<T> with SQLite or PostgreSQL + pgvector |
| Reasoning Control | Control chain-of-thought effort at global, agent, or request level |
| Inference Config | Sampling and penalty parameters with three-level override |
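The Tool System row above describes an attribute-driven API with zero boilerplate. A minimal sketch of what a tool set might look like, assuming `[Tool]` and `[ToolParam]` accept description strings and tool sets implement `IAgentToolSet` (the `WeatherTools` class, its method, and the attribute arguments are illustrative, not confirmed library API):

```csharp
using Agentic;

// Hypothetical tool set -- names and attribute signatures are assumptions.
public class WeatherTools : IAgentToolSet
{
    [Tool("Get the current weather for a city")]
    public string GetWeather(
        [ToolParam("City name, e.g. 'Reykjavik'")] string city)
    {
        // A real implementation would call a weather API here.
        return $"It is 12°C and windy in {city}.";
    }
}
```

Per the MCP Server row, the same `IAgentToolSet` could then also be exposed over HTTP as a Model Context Protocol server.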
## Installation
```shell
dotnet add package Theoistic.Agentic
```
Requirements: .NET 10 · ASP.NET Core (included via the Microsoft.AspNetCore.App framework reference)
## Quick Example
```csharp
using Agentic;

// 1. Connect to any OpenAI-compatible endpoint
var lm = new OpenAIBackend(new LMConfig
{
    Endpoint = "http://localhost:1234",
    ModelName = "your-model-name",
});

// 2. Create an agent
var agent = new Agent(lm, new AgentOptions
{
    SystemPrompt = "You are a helpful assistant.",
    OnEvent = e =>
    {
        if (e.Kind == AgentEventKind.TextDelta)
            Console.Write(e.Text);
    },
});

// 3. Chat
await agent.ChatStreamAsync("Hello! What can you do?");
```
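The feature table says the agent orchestrates MCP tools automatically during multi-turn streaming. A sketch of how a tool set might be wired into the agent — the `ToolSets` property, the `ClockTools` class, and the attribute usage are assumptions about the API, shown only to illustrate the intended flow:

```csharp
using Agentic;

// Hypothetical tool set -- class name and [Tool] signature are assumptions.
public class ClockTools : IAgentToolSet
{
    [Tool("Get the current UTC time")]
    public string GetUtcTime() => DateTime.UtcNow.ToString("O");
}

// Hypothetical wiring: the ToolSets property is not a confirmed member
// of AgentOptions.
var toolAgent = new Agent(lm, new AgentOptions
{
    SystemPrompt = "You are a helpful assistant.",
    ToolSets = new IAgentToolSet[] { new ClockTools() },
});

// The agent would then be free to call GetUtcTime mid-stream.
await toolAgent.ChatStreamAsync("What time is it in UTC?");
```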