The Uno SDK supports generating text content with LLM providers such as OpenAI, Anthropic, and Gemini.

Instantiate a model

First, create a model instance from your initialized Uno client by specifying the provider and the model name you want to use.
model := client.NewLLM(sdk.LLMOptions{
    Provider: llm.ProviderNameOpenAI,
    Model:    "gpt-4.1-mini",
})
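
The same pattern works for every supported provider; only the provider constant and the model name change. Here is a minimal sketch, assuming an Anthropic constant named llm.ProviderNameAnthropic in the same style as llm.ProviderNameOpenAI (check the llm package for the exact name):

// Assumption: llm.ProviderNameAnthropic mirrors llm.ProviderNameOpenAI;
// verify the constant name in the llm package.
claude := client.NewLLM(sdk.LLMOptions{
    Provider: llm.ProviderNameAnthropic,
    Model:    "claude-sonnet-4-5",
})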

Invoke the model

You can then invoke the model with parameters such as system instructions, temperature, and top-k sampling.
resp, err := model.NewResponses(context.Background(), &responses.Request{
    Instructions: utils.Ptr("You are a helpful assistant."),
    Input: responses.InputUnion{
        OfString: utils.Ptr("What is the capital of France?"),
    },
})
if err != nil {
    panic(err)
}
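
To set sampling parameters, the sketch below assumes optional Temperature and TopK fields on responses.Request, in the same pointer style as Instructions; these field names are assumptions, so confirm them in the responses package:

// Assumption: Temperature and TopK are optional pointer fields on
// responses.Request; the exact field names are not confirmed here.
resp, err := model.NewResponses(context.Background(), &responses.Request{
    Instructions: utils.Ptr("You are a helpful assistant."),
    Temperature:  utils.Ptr(0.2),
    TopK:         utils.Ptr(int64(40)),
    Input: responses.InputUnion{
        OfString: utils.Ptr("What is the capital of France?"),
    },
})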

Response

The model returns a slice of outputs. Each output can be one of several types: text, images, reasoning, and so on. You can access the text content like this:
// Access the text output
for _, output := range resp.Output {
    if output.OfOutputMessage != nil {
        for _, content := range output.OfOutputMessage.Content {
            if content.OfOutputText != nil {
                fmt.Println(content.OfOutputText.Text)
            }
        }
    }
}
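
When you only need the concatenated text, a small helper keeps call sites tidy. Here is a minimal sketch built from the fields shown above, assuming the response type is named responses.Response:

import "strings"

// collectText concatenates every text segment in a response.
// Assumption: the response type is responses.Response.
func collectText(resp *responses.Response) string {
    var sb strings.Builder
    for _, output := range resp.Output {
        if output.OfOutputMessage == nil {
            continue
        }
        for _, content := range output.OfOutputMessage.Content {
            if content.OfOutputText != nil {
                sb.WriteString(content.OfOutputText.Text)
            }
        }
    }
    return sb.String()
}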

Streaming Responses

For real-time applications, the SDK supports streaming responses. Streaming returns a channel that yields chunks of the response as the LLM generates them.
import (
    "context"
    "fmt"

    "github.com/curaious/uno/internal/utils"
    "github.com/curaious/uno/pkg/llm/responses"
)

func main() {
    // ... client and model initialization ...

    stream, err := model.NewStreamingResponses(context.Background(), &responses.Request{
        Input: responses.InputUnion{
            OfString: utils.Ptr("Write a poem about coding."),
        },
    })
    if err != nil {
        panic(err)
    }

    for chunk := range stream {
        // Handle different types of chunks
        if chunk.OfOutputTextDelta != nil {
            fmt.Print(chunk.OfOutputTextDelta.Delta)
        }
    }
}
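
To keep the complete text while still rendering it incrementally, accumulate the deltas as they arrive. This is a minimal sketch using only the chunk field shown above (add "strings" to the imports):

var sb strings.Builder
for chunk := range stream {
    if chunk.OfOutputTextDelta != nil {
        delta := chunk.OfOutputTextDelta.Delta
        fmt.Print(delta)      // render as it streams
        sb.WriteString(delta) // keep the full text
    }
}
fullText := sb.String()
_ = fullText // e.g. persist it as conversation history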

Multi-turn conversations

Instead of a single string, you can pass a list of messages so the model sees the full conversation history.
resp, err := model.NewResponses(ctx, &responses.Request{
    Input: responses.InputUnion{
        OfInputMessageList: responses.InputMessageList{
            {
                OfEasyInput: &responses.EasyMessage{
                    Role:    "user",
                    Content: responses.EasyInputContentUnion{OfString: utils.Ptr("Hi!")},
                },
            },
            {
                OfEasyInput: &responses.EasyMessage{
                    Role:    "assistant",
                    Content: responses.EasyInputContentUnion{OfString: utils.Ptr("Hello! How can I help you?")},
                },
            },
            {
                OfEasyInput: &responses.EasyMessage{
                    Role:    "user",
                    Content: responses.EasyInputContentUnion{OfString: utils.Ptr("Tell me a joke.")},
                },
            },
        },
    },
})
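
To continue the conversation, append the model's reply and the next user turn to the list and invoke again. The sketch below assumes the list's element type is named responses.InputMessageUnion (the unnamed composite literals above imply such a union type; check the responses package for its actual name) and reuses the collectText helper from the Response section:

// Assumption: responses.InputMessageUnion is an assumed name for the
// element type behind the unnamed composite literals above.
history := responses.InputMessageList{
    {
        OfEasyInput: &responses.EasyMessage{
            Role:    "user",
            Content: responses.EasyInputContentUnion{OfString: utils.Ptr("Tell me a joke.")},
        },
    },
}

resp, err := model.NewResponses(ctx, &responses.Request{
    Input: responses.InputUnion{OfInputMessageList: history},
})
if err != nil {
    panic(err)
}

// Feed the assistant's reply back as context for the next turn.
history = append(history, responses.InputMessageUnion{
    OfEasyInput: &responses.EasyMessage{
        Role:    "assistant",
        Content: responses.EasyInputContentUnion{OfString: utils.Ptr(collectText(resp))},
    },
})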