## Features

Agentic is built around a set of composable features, each of which can be used on its own or combined with the others.

| Feature | Description |
| --- | --- |
| `ILLMBackend` | Unified abstraction over any inference source; swap backends without touching agent code |
| `OpenAIBackend` | OpenAI-compatible REST client with streaming, embeddings, vision, and health checks |
| `NativeBackend` | Local llama.cpp inference with auto-install from GitHub releases |
| `BackendRouter` | Compose multiple backends; route chat by model name and embeddings to a dedicated model |
| `LlamaRuntimeInstaller` | On-demand runtime installer for CPU, CUDA, and Vulkan on Windows and Linux |
| `Agent` | Multi-turn streaming agent with automatic MCP tool orchestration |
| Image Input | Send images alongside text as a URL, local file, or base64 |
| Workflows | Ordered multi-step execution with per-step async guardrails and retry |
| Tool System | Define tools with `[Tool]` / `[ToolParam]` attributes; zero boilerplate |
| Tool Context | HTTP headers forwarded to tool methods via `ToolContext` |
| MCP Server | Expose any `IAgentToolSet` over HTTP as a Model Context Protocol server |
| Context Compaction | Auto-summarise older history into a structured checkpoint |
| Vector Storage | `IStore` / `ICollection<T>` with SQLite or PostgreSQL + pgvector |
| Reasoning Control | Control chain-of-thought effort at the global, agent, or request level |
| Inference Config | Sampling and penalty parameters with three-level override |
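As a rough illustration of the attribute-based tool system above, a tool set might look like the following sketch. The names `[Tool]`, `[ToolParam]`, and `IAgentToolSet` come from the feature list; the attribute arguments, method signature, and `WeatherTools` class are illustrative assumptions, not the library's documented API.

```csharp
// Hypothetical sketch — signatures are assumed, only the attribute and
// interface names ([Tool], [ToolParam], IAgentToolSet) appear in the docs.
public class WeatherTools : IAgentToolSet
{
    [Tool("Get the current temperature for a city")]
    public string GetTemperature(
        [ToolParam("City name, e.g. \"Berlin\"")] string city)
    {
        // A real tool would call out to a weather service here.
        return $"21 °C in {city}";
    }
}
```

The idea is that the agent discovers annotated methods via reflection and exposes them to the model as callable tools, so no manual JSON schema or dispatch code is needed.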

## Table of contents