A simple agent is the most basic type of agent in the Uno SDK. It executes in-process without durability, making it perfect for stateless interactions, testing, and simple use cases that don’t require crash recovery or long-running workflows.

Overview

Simple agents provide:
  • System instructions: Define the agent’s behavior and personality
  • LLM integration: Use any supported LLM provider (OpenAI, Anthropic, Gemini, etc.)
  • Tool support: Optional tools for function calling
  • Conversation history: Optional memory across interactions
  • Streaming responses: Real-time response chunks via callbacks
Unlike durable agents, simple agents execute synchronously and don’t persist state between runs. They’re ideal for stateless applications, quick prototypes, and scenarios where you don’t need crash recovery.

Creating a Simple Agent

To create a simple agent, use client.NewAgent() with AgentOptions:
import (
    "context"
    "fmt"
    "log"
    "os"

    "github.com/curaious/uno/internal/utils"
    "github.com/curaious/uno/pkg/agent-framework/agents"
    "github.com/curaious/uno/pkg/gateway"
    "github.com/curaious/uno/pkg/llm"
    "github.com/curaious/uno/pkg/llm/responses"
    "github.com/curaious/uno/pkg/sdk"
)

// Initialize the SDK client
client, err := sdk.New(&sdk.ClientOptions{
    LLMConfigs: sdk.NewInMemoryConfigStore([]*gateway.ProviderConfig{
        {
            ProviderName: llm.ProviderNameOpenAI,
            ApiKeys: []*gateway.APIKeyConfig{
                {Name: "default", APIKey: os.Getenv("OPENAI_API_KEY")},
            },
        },
    }),
})
if err != nil {
    log.Fatal(err)
}

// Create the agent
agent := client.NewAgent(&sdk.AgentOptions{
    Name:        "Hello world agent",
    Instruction: client.Prompt("You are a helpful assistant. You greet the user with a light joke."),
    LLM: client.NewLLM(sdk.LLMOptions{
        Provider: llm.ProviderNameOpenAI,
        Model:    "gpt-4o-mini",
    }),
    Parameters: responses.Parameters{
        Temperature: utils.Ptr(0.2),
    },
})

AgentOptions Fields

| Field | Type | Description |
| --- | --- | --- |
| Name | string | A unique identifier for the agent |
| Instruction | core.SystemPromptProvider | System prompt defining agent behavior (use client.Prompt() for simple strings) |
| LLM | llm.Provider | The LLM provider instance (created via client.NewLLM()) |
| Parameters | responses.Parameters | Optional LLM parameters (temperature, max tokens, etc.) |
| Tools | []core.Tool | Optional array of tools the agent can use |
| History | *history.CommonConversationManager | Optional conversation history manager |
| Output | map[string]any | Optional JSON schema for structured output |
| McpServers | []*mcpclient.MCPClient | Optional MCP server clients |

Executing an Agent

Execute an agent using the Execute() method with AgentInput:
out, err := agent.Execute(context.Background(), &agents.AgentInput{
    Messages: []responses.InputMessageUnion{
        responses.UserMessage("Hello!"),
    },
})
if err != nil {
    log.Fatal(err)
}

// Access the response
fmt.Println(out.Output[0].OfOutputMessage.Content[0].OfOutputText.Text)

AgentInput Fields

| Field | Type | Description |
| --- | --- | --- |
| Messages | []responses.InputMessageUnion | Array of input messages (use responses.UserMessage() helper) |
| Namespace | string | Optional namespace for conversation isolation |
| PreviousMessageID | string | Optional ID of previous message for conversation continuity |
| RunContext | map[string]any | Optional context data for template variable resolution |
| Callback | func(chunk *responses.ResponseChunk) | Optional callback for streaming responses |

AgentOutput Structure

The Execute() method returns an AgentOutput:
type AgentOutput struct {
    RunID            string                          // Unique run identifier
    Status           core.RunStatus                  // Execution status
    Output           []responses.InputMessageUnion   // Agent's response messages
    PendingApprovals []responses.FunctionCallMessage // Tool calls requiring approval
}

Complete Example

Here’s a complete working example:
package main

import (
    "context"
    "fmt"
    "log"
    "os"

    "github.com/curaious/uno/internal/utils"
    "github.com/curaious/uno/pkg/agent-framework/agents"
    "github.com/curaious/uno/pkg/gateway"
    "github.com/curaious/uno/pkg/llm"
    "github.com/curaious/uno/pkg/llm/responses"
    "github.com/curaious/uno/pkg/sdk"
)

func main() {
    // Initialize SDK client
    client, err := sdk.New(&sdk.ClientOptions{
        LLMConfigs: sdk.NewInMemoryConfigStore([]*gateway.ProviderConfig{
            {
                ProviderName: llm.ProviderNameOpenAI,
                ApiKeys: []*gateway.APIKeyConfig{
                    {Name: "default", APIKey: os.Getenv("OPENAI_API_KEY")},
                },
            },
        }),
    })
    if err != nil {
        log.Fatal(err)
    }

    // Create agent
    agent := client.NewAgent(&sdk.AgentOptions{
        Name:        "Hello world agent",
        Instruction: client.Prompt("You are a helpful assistant. You greet the user with a light joke."),
        LLM: client.NewLLM(sdk.LLMOptions{
            Provider: llm.ProviderNameOpenAI,
            Model:    "gpt-4o-mini",
        }),
        Parameters: responses.Parameters{
            Temperature: utils.Ptr(0.2),
        },
    })

    // Execute agent
    out, err := agent.Execute(context.Background(), &agents.AgentInput{
        Messages: []responses.InputMessageUnion{
            responses.UserMessage("Hello!"),
        },
    })
    if err != nil {
        log.Fatal(err)
    }

    // Print response
    fmt.Println(out.Output[0].OfOutputMessage.Content[0].OfOutputText.Text)
}

Streaming Responses

To receive streaming responses, provide a Callback function in AgentInput:
out, err := agent.Execute(context.Background(), &agents.AgentInput{
    Messages: []responses.InputMessageUnion{
        responses.UserMessage("Tell me a story"),
    },
    Callback: func(chunk *responses.ResponseChunk) {
        // Handle different chunk types
        switch chunk.ChunkType() {
        case "response.output_text.delta":
            // Print text deltas as they arrive
            if chunk.OfOutputTextDelta != nil {
                fmt.Print(chunk.OfOutputTextDelta.Delta)
            }
        case "response.output_text.done":
            // Text generation complete
            if chunk.OfOutputTextDone != nil && chunk.OfOutputTextDone.Text != nil {
                fmt.Printf("\n\nComplete text: %s\n", *chunk.OfOutputTextDone.Text)
            }
        }
    },
})

Helper Functions

The SDK provides convenient helper functions for creating messages:
  • responses.UserMessage(msg string): Creates a user message from a string
  • responses.SystemMessage(msg string): Creates a system message
  • responses.AssistantMessage(msg string): Creates an assistant message

Next Steps