System instructions define the behavior and personality of an agent. The SDK provides flexible ways to create system instructions, from simple strings to dynamic templates loaded from remote sources.
Basic Usage
The simplest way to provide a system instruction is using a plain string:
agent := client.NewAgent(&sdk.AgentOptions{
    Name:        "Assistant",
    Instruction: client.Prompt("You are a helpful assistant."),
    LLM:         model,
})
Template Variables
System instructions support template variables using Go template syntax. Variables are resolved at runtime using resolvers.
Simple Template Variables
Use {{variable}} syntax in your instruction string, and provide a resolver to populate the values:
contextData := map[string]any{
    "name": "Alice",
    "role": "developer",
}

agent := client.NewAgent(&sdk.AgentOptions{
    Name: "Personalized Assistant",
    Instruction: client.Prompt(
        "You are a helpful assistant. You are interacting with {{name}}, who is a {{role}}.",
    ),
    LLM: model,
})

agent.Execute(context.Background(), &agents.AgentInput{
    RunContext: contextData,
})
The template syntax {{variable}} is automatically converted to Go template format {{ .variable }} for resolution.
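This conversion and resolution step can be sketched with the standard library. The following is a minimal illustration of the documented behavior, not the SDK's internal implementation:

```go
package main

import (
	"bytes"
	"fmt"
	"regexp"
	"text/template"
)

// resolve converts {{variable}} placeholders to Go template syntax
// ({{ .variable }}) and executes the template against the run context.
// Sketch only; the SDK's actual resolver may differ in detail.
func resolve(prompt string, data map[string]any) (string, error) {
	re := regexp.MustCompile(`\{\{\s*(\w+)\s*\}\}`)
	goTmpl := re.ReplaceAllString(prompt, "{{ .$1 }}")

	t, err := template.New("prompt").Parse(goTmpl)
	if err != nil {
		return "", err
	}
	var buf bytes.Buffer
	if err := t.Execute(&buf, data); err != nil {
		return "", err
	}
	return buf.String(), nil
}

func main() {
	out, err := resolve(
		"You are interacting with {{name}}, who is a {{role}}.",
		map[string]any{"name": "Bob", "role": "admin"},
	)
	if err != nil {
		panic(err)
	}
	fmt.Println(out)
}
```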
Custom Resolvers
You can create custom resolver functions for more complex template resolution logic. A resolver receives the raw prompt string and the run context.
import "github.com/curaious/uno/pkg/agent-framework/prompts"
customResolver := func(promptStr string, data map[string]any) (string, error) {
    // Your logic to resolve the prompt into the final system instruction.
    // Returning the prompt unchanged is a valid no-op resolver.
    return promptStr, nil
}
agent := client.NewAgent(&sdk.AgentOptions{
    Name: "Dynamic Assistant",
    Instruction: client.Prompt(
        "Hello {{userName}}, current time is {{timestamp}}",
        prompts.WithResolver(customResolver),
    ),
    LLM: model,
})
Remote Prompts
You can load system instructions from the Uno Gateway server using RemotePrompt.
agent := client.NewAgent(&sdk.AgentOptions{
    Name:        "Remote Assistant",
    Instruction: client.RemotePrompt("my-prompt-name", "production"),
    LLM:         model,
})
Parameters
- name: the name of the prompt stored in the Uno Gateway server
- label: the prompt label ("production" or "latest")
- resolvers: optional resolvers for template variable resolution
For this to work, the client must be configured with the gateway’s endpoint, project name, and credentials.
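A client configuration for remote prompts might look like the sketch below. The field names Endpoint, ProjectName, and Credentials are assumptions for illustration only; check sdk.ClientOptions for the actual names your SDK version uses.

```go
// Hypothetical configuration; field names are assumptions, not the
// confirmed sdk.ClientOptions API.
client, err := sdk.New(&sdk.ClientOptions{
	Endpoint:    "https://gateway.example.com",
	ProjectName: "my-project",
	Credentials: os.Getenv("UNO_GATEWAY_TOKEN"),
})
```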
Remote Prompt with Resolvers
You can combine remote prompts with resolvers for dynamic content. Resolvers are passed after the name and label:
agent := client.NewAgent(&sdk.AgentOptions{
    Name: "Versioned Assistant",
    Instruction: client.RemotePrompt(
        "assistant-prompt",
        "production",
        prompts.WithResolver(customResolver),
    ),
    LLM: model,
})
contextData := map[string]any{
    "environment": "production",
    "version":     "1.0.0",
}

agent.Execute(context.Background(), &agents.AgentInput{
    RunContext: contextData,
})
Custom Prompt Loaders
For advanced use cases, you can implement a custom PromptLoader interface to load prompts from any source:
type CustomLoader struct {
    // Your custom fields
}

func (l *CustomLoader) LoadPrompt(ctx context.Context) (string, error) {
    // Load prompt from your custom source,
    // e.g. a database, the file system, or an external API.
    return "Your prompt content", nil
}

customLoader := &CustomLoader{}

agent := client.NewAgent(&sdk.AgentOptions{
    Name:        "Custom Assistant",
    Instruction: client.CustomPrompt(customLoader),
    LLM:         model,
})
Complete Example
Here’s a complete example demonstrating different ways to use system instructions:
package main
import (
    "context"
    "log"

    "github.com/curaious/uno/pkg/agent-framework/agents"
    "github.com/curaious/uno/pkg/gateway"
    "github.com/curaious/uno/pkg/llm"
    "github.com/curaious/uno/pkg/llm/responses"
    "github.com/curaious/uno/pkg/sdk"
)
func main() {
    client, err := sdk.New(&sdk.ClientOptions{
        LLMConfigs: sdk.NewInMemoryConfigStore([]*gateway.ProviderConfig{
            {
                ProviderName:  llm.ProviderNameOpenAI,
                BaseURL:       "",
                CustomHeaders: nil,
                ApiKeys: []*gateway.APIKeyConfig{
                    {
                        Name:   "Key 1",
                        APIKey: "",
                    },
                },
            },
        }),
    })
    if err != nil {
        log.Fatal(err)
    }

    model := client.NewLLM(sdk.LLMOptions{
        Provider: llm.ProviderNameOpenAI,
        Model:    "gpt-4o-mini",
    })
    // Example 1: Simple string instruction
    agent1 := client.NewAgent(&sdk.AgentOptions{
        Name:        "Simple Assistant",
        Instruction: client.Prompt("You are a helpful assistant."),
        LLM:         model,
    })
    _ = agent1 // declared for illustration only
    // Example 2: Template with variables
    agent2 := client.NewAgent(&sdk.AgentOptions{
        Name: "Personalized Assistant",
        Instruction: client.Prompt(
            "You are a helpful assistant for {{userName}}, who is a {{userRole}}.",
        ),
        LLM: model,
    })

    // Example 3: Remote prompt (requires endpoint and projectName)
    // agent3 := client.NewAgent(&sdk.AgentOptions{
    //     Name:        "Remote Assistant",
    //     Instruction: client.RemotePrompt("my-prompt", "production"),
    //     LLM:         model,
    // })
    // Execute agent
    out, err := agent2.Execute(context.Background(), &agents.AgentInput{
        Messages: []responses.InputMessageUnion{
            responses.UserMessage("Hello!"),
        },
        RunContext: map[string]any{
            "userName": "Bob",
            "userRole": "developer",
        },
    })
    if err != nil {
        log.Fatal(err)
    }

    log.Println(out[0].OfOutputMessage.Content[0].OfOutputText.Text)
}
Notes
- Template variables use the {{variable}} syntax, which is automatically converted to Go template format
- Remote prompts require the SDK client to be initialized with endpoint and projectName
- Custom prompt loaders must implement the PromptLoader interface with a LoadPrompt(ctx context.Context) (string, error) method