Uno is a comprehensive platform for building LLM-powered applications. It offers two complementary products:
  • Uno SDK — A Golang SDK for making LLM calls, building AI agents, and orchestrating complex workflows across multiple providers.
  • Uno Gateway — A server that acts as an LLM gateway with virtual key management, observability, and a no-code agent builder with a built-in conversational UI.

Uno SDK

A Golang SDK for developers who want fine-grained control over LLM interactions and agent orchestration. The Uno SDK abstracts away provider differences, letting you switch between OpenAI, Anthropic, and Gemini with a single line change. Build agents, connect MCP tools, and get structured outputs—all through a unified API.

Core Capabilities

Multi-Provider Support

Write once, deploy anywhere. Seamlessly switch between OpenAI, Anthropic, and Gemini without rewriting your application logic.

Agent SDK

Build sophisticated AI agents with system instructions, tools, conversation history, and multi-step reasoning out of the box.
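
As a rough sketch of what that can look like, assuming an initialized client and hypothetical agent names (NewAgent, AgentOptions, Run) that stand in for the SDK's actual agent API:
// Hypothetical sketch: the agent constructor, options, and Run method below are illustrative, not the SDK's confirmed API.
agent := client.NewAgent(sdk.AgentOptions{
    Provider:     llm.ProviderNameOpenAI,
    Model:        "gpt-4.1-mini",
    Instructions: "You are a research assistant.",
    Tools:        []sdk.Tool{searchTool}, // searchTool is a placeholder tool definition
})

// A run keeps conversation history and loops over tool calls until the task completes.
result, err := agent.Run(ctx, "Find and summarize the latest release notes.")
if err != nil {
    log.Fatal(err)
}
fmt.Println(result.Output)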

MCP Integration

Connect to any Model Context Protocol (MCP) server and extend your agents with external tools and resources.
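
For instance, a connection sketch along these lines, where the MCP helper and option names are assumptions rather than the SDK's documented API:
// Hypothetical sketch: ConnectMCP and MCPServerOptions are illustrative names.
mcpTools, err := client.ConnectMCP(ctx, sdk.MCPServerOptions{
    URL: "http://localhost:3001/mcp", // endpoint of any MCP server
})
if err != nil {
    log.Fatal(err)
}

// Hand the server's tools to an agent alongside its instructions.
agent := client.NewAgent(sdk.AgentOptions{
    Provider:     llm.ProviderNameOpenAI,
    Model:        "gpt-4.1-mini",
    Instructions: "Use the available tools to answer questions.",
    Tools:        mcpTools,
})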

Structured Outputs

Define JSON schemas to ensure consistent, validated responses from LLMs—perfect for data extraction and API responses.
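
As a sketch of the idea, assuming the request accepts a JSON Schema through a structured-output field (the OutputSchema field name here is an assumption):
// Hypothetical sketch: how the schema is attached to the request is assumed.
schema := map[string]any{
    "type": "object",
    "properties": map[string]any{
        "name":  map[string]any{"type": "string"},
        "email": map[string]any{"type": "string"},
    },
    "required": []string{"name", "email"},
}

resp, err := model.NewResponses(ctx, &responses.Request{
    Instructions: utils.Ptr("Extract the contact details from the text."),
    Input: responses.InputUnion{
        OfString: utils.Ptr("You can reach Jane Doe at jane@example.com."),
    },
    OutputSchema: schema, // assumed field name for schema-validated output
})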

Quick Examples

Use the SDK as a lightweight wrapper to call any LLM provider. Simply change the Provider and Model to switch between providers—your application logic stays the same:
// Create a provider-agnostic model handle; change Provider and Model to switch providers.
model := client.NewLLM(sdk.LLMOptions{
    Provider: llm.ProviderNameOpenAI,
    Model:    "gpt-4.1-mini",
})

// Send a request through the unified Responses API.
resp, _ := model.NewResponses(ctx, &responses.Request{
    Instructions: utils.Ptr("You are a helpful assistant."),
    Input: responses.InputUnion{
        OfString: utils.Ptr("What is the capital of France?"),
    },
})
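
Switching the same call to a different provider is just a change to those two fields. For example, assuming the SDK exposes an Anthropic provider constant alongside the OpenAI one (the model name is illustrative):
model := client.NewLLM(sdk.LLMOptions{
    Provider: llm.ProviderNameAnthropic, // assumed constant, mirroring ProviderNameOpenAI
    Model:    "claude-sonnet-4-5",
})

The NewResponses call and the rest of your application logic stay the same.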

Get Started with the SDK →

Install the SDK and make your first LLM call in under 5 minutes.

Uno Gateway

Uno Gateway is a server that provides two key capabilities: an LLM gateway and a no-code agent builder.
  • LLM gateway: Point any OpenAI, Anthropic, or Gemini SDK at the gateway and unlock virtual keys, rate limiting, and request logging.
  • Agent builder: Build and run agents without writing code, through a built-in conversational UI.

Virtual Keys

Protect your provider API keys. Generate virtual keys with usage limits and revoke them instantly when needed.

Drop-in Replacement

Point your existing OpenAI, Anthropic, or Google GenAI client to the gateway and get access to all gateway features.

Deep Observability

Track every request with OpenTelemetry. Monitor token usage, latency, and costs with ClickHouse-powered analytics.

Request Logging

Every LLM call is logged with full request/response details for debugging and compliance.

Drop-in Replacement

Point any existing SDK to the gateway—it just works:
client := openai.NewClient(
    option.WithBaseURL("http://localhost:6060/api/gateway/openai"),
    option.WithAPIKey("your-virtual-key"),
)
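
The same pattern should work with the other supported SDKs. For example, with the Anthropic Go SDK, assuming the gateway exposes an equivalent Anthropic route:
client := anthropic.NewClient(
    option.WithBaseURL("http://localhost:6060/api/gateway/anthropic"), // assumed route, mirroring the OpenAI one above
    option.WithAPIKey("your-virtual-key"),
)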

Key Management

Manage provider keys and virtual keys, with rate limits defined per key.
Uno Gateway Dashboard

Gateway Logs

View all LLM requests with full details—prompts, responses, token usage, and latency metrics.
LLM Gateway Logs

Deploy the Gateway →

Get Uno Gateway running locally in less than 5 minutes.

Choose Your Path

SDK

Build with the SDK

Use the Golang SDK for maximum flexibility and control over your LLM applications

Gateway

Deploy the Gateway

Get virtual keys, observability, and no-code agent building out of the box