Tool calling (also known as function calling) allows LLMs to interact with external systems by requesting the execution of specific functions. The Uno SDK provides a unified interface for defining tools and handling tool-call requests across LLM providers.

Defining Tools

Tools are defined as part of the responses.Request struct using the Tools field.
// Define a function tool
getWeatherTool := responses.ToolUnion{
    OfFunction: &responses.FunctionTool{
        Name:        "get_current_weather",
        Description: utils.Ptr("Get the current weather in a given location"),
        Parameters: map[string]any{
            "type": "object",
            "properties": map[string]any{
                "location": map[string]any{
                    "type":        "string",
                    "description": "The city and state, e.g. San Francisco, CA",
                },
                "unit": map[string]any{
                    "type": "string",
                    "enum": []string{"celsius", "fahrenheit"},
                },
            },
            "required": []string{"location"},
        },
    },
}
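The Parameters map follows JSON Schema conventions for describing the function's inputs, and a request may carry several tools at once. As a minimal illustrative sketch (the tool name and description here are assumptions, not part of the SDK), a parameter-free tool looks like this:

// A second, illustrative tool that takes no arguments.
getTimeTool := responses.ToolUnion{
    OfFunction: &responses.FunctionTool{
        Name:        "get_current_time",
        Description: utils.Ptr("Get the current UTC time"),
        Parameters: map[string]any{
            "type":       "object",
            "properties": map[string]any{},
        },
    },
}

Both tools can then be passed together in the request's Tools slice.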

Making a Request with Tools

Include the defined tools in your request to the model.
resp, err := model.NewResponses(ctx, &responses.Request{
    Input: responses.InputUnion{
        OfString: utils.Ptr("What's the weather like in Paris?"),
    },
    Tools: []responses.ToolUnion{getWeatherTool},
})
if err != nil {
    panic(err)
}

Handling Tool Calls

You can iterate through the Output list to find any requested tool calls.
for _, output := range resp.Output {
    // Check if the output item is a function call
    if output.OfFunctionCall != nil {
        fnCall := output.OfFunctionCall
        fmt.Printf("Model requested tool: %s\n", fnCall.Name)
        fmt.Printf("Arguments: %s\n", fnCall.Arguments)
        
        // Execute your local logic here...
    }
}
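Since Arguments is a JSON string, a common pattern is to unmarshal it into a typed struct before dispatching to your own code. A minimal sketch for the get_current_weather tool defined earlier (handleWeatherCall and its canned result are illustrative; encoding/json must be imported):

type weatherArgs struct {
    Location string `json:"location"`
    Unit     string `json:"unit,omitempty"`
}

// handleWeatherCall parses the raw JSON arguments and runs the local logic.
func handleWeatherCall(rawArgs string) (string, error) {
    var args weatherArgs
    if err := json.Unmarshal([]byte(rawArgs), &args); err != nil {
        return "", fmt.Errorf("invalid tool arguments: %w", err)
    }
    // Replace this canned result with a real weather lookup.
    return fmt.Sprintf(`{"temperature": 22, "condition": "Sunny", "location": %q}`, args.Location), nil
}

In the loop above, you would call handleWeatherCall(fnCall.Arguments) in place of the placeholder comment.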

Streaming Tool Calls

The following example demonstrates how to detect and process a tool call request from a streaming response.
import (
    "fmt"
    "github.com/curaious/uno/pkg/llm/responses"
)

// ... request initialization with Tools ...

stream, err := model.NewStreamingResponses(ctx, request)
if err != nil {
    panic(err)
}

for chunk := range stream {
    // Detect when an output item is completed
    if chunk.ChunkType() == "response.output_item.done" {
        item := chunk.OfOutputItemDone.Item
        
        // Check if the completed item is a function call
        if item.Type == "function_call" {
            fmt.Printf("Model requested tool: %s\n", *item.Name)
            fmt.Printf("Arguments: %s\n", *item.Arguments)
            
            // Execute your local logic here...
        }
    }
}
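If you prefer to keep stream consumption separate from tool execution, you can collect the completed function calls first and dispatch them once the stream ends. This sketch uses only the chunk accessors shown above:

type pendingCall struct {
    Name      string
    Arguments string
}

var calls []pendingCall
for chunk := range stream {
    if chunk.ChunkType() == "response.output_item.done" {
        item := chunk.OfOutputItemDone.Item
        if item.Type == "function_call" {
            // Record the call; execute after the stream is drained.
            calls = append(calls, pendingCall{Name: *item.Name, Arguments: *item.Arguments})
        }
    }
}

for _, call := range calls {
    fmt.Printf("Executing %s with %s\n", call.Name, call.Arguments)
}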

Returning Tool Results

After executing the requested tool, you can send the results back to the model to continue the conversation. Use the OfFunctionCallOutput message type in the InputMessageList.
resp, err := model.NewResponses(ctx, &responses.Request{
    Input: responses.InputUnion{
        OfInputMessageList: responses.InputMessageList{
            // ... original user message ...
            // ... original tool call request from model ...
            {
                OfFunctionCallOutput: &responses.FunctionCallOutputMessage{
                    CallID: "call_123", // Must match the call_id of the model's function call
                    Output: responses.FunctionCallOutputContentUnion{
                        OfString: utils.Ptr(`{"temperature": 22, "condition": "Sunny"}`),
                    },
                },
            },
        },
    },
})
if err != nil {
    panic(err)
}
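In practice the tool result is usually serialized from a Go value rather than written as a JSON literal. A sketch of building the output message with encoding/json (the result fields are illustrative):

// Serialize the locally computed result to JSON.
result := map[string]any{
    "temperature": 22,
    "condition":   "Sunny",
}

payload, err := json.Marshal(result)
if err != nil {
    panic(err)
}

outputMsg := responses.FunctionCallOutputMessage{
    CallID: "call_123", // must match the call_id of the model's function call
    Output: responses.FunctionCallOutputContentUnion{
        OfString: utils.Ptr(string(payload)),
    },
}

outputMsg then takes the place of the FunctionCallOutputMessage literal in the request above.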