Conversation history enables agents to maintain context across multiple interactions. By preserving previous messages, agents can reference earlier exchanges and build upon prior context, facilitating natural, context-aware conversations.
Overview
When a conversation history manager is provided to an agent, it performs the following operations:
- Loads Previous Messages: Automatically retrieves conversation history based on the previous message ID
- Saves Automatically: Persists new messages after each execution
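The SDK does not document the manager's internals, but the load step above ("based on the previous message ID") can be pictured as a linked chain of messages: each stored message records the ID of the message before it, and loading walks that chain back to the start. The following self-contained sketch is illustrative only, with hypothetical local types that are not part of the SDK:

```go
package main

import "fmt"

// storedMessage is a hypothetical record: each message links to the one
// before it via PreviousID ("" marks the first message in a thread).
type storedMessage struct {
	ID         string
	PreviousID string
	Content    string
}

type chainStore struct {
	byID map[string]storedMessage
}

func newChainStore() *chainStore {
	return &chainStore{byID: map[string]storedMessage{}}
}

func (s *chainStore) save(m storedMessage) { s.byID[m.ID] = m }

// loadHistory walks the previous-message chain starting at
// previousMessageID, then reverses the result into chronological order.
func (s *chainStore) loadHistory(previousMessageID string) []storedMessage {
	var reversed []storedMessage
	for id := previousMessageID; id != ""; {
		m, ok := s.byID[id]
		if !ok {
			break
		}
		reversed = append(reversed, m)
		id = m.PreviousID
	}
	out := make([]storedMessage, len(reversed))
	for i, m := range reversed {
		out[len(reversed)-1-i] = m
	}
	return out
}

func main() {
	s := newChainStore()
	s.save(storedMessage{ID: "m1", Content: "Hello! My name is Alice"})
	s.save(storedMessage{ID: "m2", PreviousID: "m1", Content: "Hi Alice!"})
	for _, m := range s.loadHistory("m2") {
		fmt.Println(m.Content)
	}
}
```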
Enabling Conversation History
To enable conversation history, create a ConversationManager instance and pass it to the agent's History field. When invoking the agent, you can optionally set Namespace to bucket conversations into separate namespaces.
```go
cm := client.NewConversationManager()

agent := agents.NewAgent(&agents.AgentOptions{
    Name:    "Hello world agent",
    LLM:     model,
    History: cm,
})

handle, err := agent.Execute(context.Background(), &agents.AgentInput{
    Namespace: "default",
    Messages: []responses.InputMessageUnion{
        responses.UserMessage("Hello! My name is Alice"),
    },
})

out, err := handle.Result()
```
Continuing the conversation
To continue the conversation, pass the same thread ID when invoking the agent again.
```go
// First invocation
threadID := uuid.NewString()

handle, err := agent.Execute(context.Background(), &agents.AgentInput{
    Namespace: "default",
    ThreadID:  threadID,
    Messages: []responses.InputMessageUnion{
        responses.UserMessage("Hello! My name is Alice"),
    },
})
out, err := handle.Result()

// Second invocation
handle, err = agent.Execute(context.Background(), &agents.AgentInput{
    Namespace: "default",
    ThreadID:  threadID, // Pass the same thread ID to continue the conversation
    Messages: []responses.InputMessageUnion{
        responses.UserMessage("What's my name?"),
    },
})
out, err = handle.Result()
```
Persistence
The conversation manager supports three persistence configurations for storing conversation history:
1. No Persistence
When the SDK client is initialized without an endpoint and project name, messages are not persisted; history remains available in memory for the lifetime of the client.
2. HasteKit Gateway Persistence
When the SDK client is initialized with an endpoint and project name, the conversation manager automatically persists messages to the HasteKit Gateway server.
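The two built-in modes differ only in how the client is constructed. A minimal sketch, assuming the option fields are named `Endpoint` and `ProjectName` (check the SDK's `ClientOptions` for the exact field names; the URL below is hypothetical):

```go
// In-memory only: no endpoint or project name, so history lives for the
// lifetime of this client and is lost on restart.
client, err := hastekit.New(&hastekit.ClientOptions{
    ProviderConfigs: providerConfigs,
})

// Gateway-backed: with an endpoint and project name set, the conversation
// manager persists messages to the HasteKit Gateway server.
client, err = hastekit.New(&hastekit.ClientOptions{
    Endpoint:        "https://gateway.example.com",
    ProjectName:     "my-project",
    ProviderConfigs: providerConfigs,
})
```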
3. Custom Persistence
To implement custom persistence, implement the ConversationPersistenceManager interface:
```go
type ConversationPersistenceManager interface {
    LoadMessages(ctx context.Context, namespace string, threadID string, previousMessageID string) ([]conversation.ConversationMessage, error)
    SaveMessages(ctx context.Context, namespace, msgId, previousMsgId, threadID string, conversationId string, messages []responses.InputMessageUnion, meta map[string]any) error
    SaveSummary(ctx context.Context, namespace string, summary conversation.Summary) error
}
```
Then pass your implementation to the conversation manager:
```go
cm := client.NewConversationManager(history.WithPersistence(yourImpl))
```
Complete Example
The following example demonstrates an agent with conversation history:
```go
package main

import (
    "context"
    "fmt"
    "log"
    "os"

    "github.com/bytedance/sonic"
    "github.com/google/uuid"
    "github.com/hastekit/hastekit-sdk-go/pkg/agents"
    "github.com/hastekit/hastekit-sdk-go/pkg/gateway"
    "github.com/hastekit/hastekit-sdk-go/pkg/gateway/llm"
    "github.com/hastekit/hastekit-sdk-go/pkg/gateway/llm/responses"

    hastekit "github.com/hastekit/hastekit-sdk-go"
)

func main() {
    client, err := hastekit.New(&hastekit.ClientOptions{
        ProviderConfigs: []gateway.ProviderConfig{
            {
                ProviderName:  llm.ProviderNameOpenAI,
                BaseURL:       "",
                CustomHeaders: nil,
                ApiKeys: []*gateway.APIKeyConfig{
                    {
                        Name:   "Key 1",
                        APIKey: os.Getenv("OPENAI_API_KEY"),
                    },
                },
            },
        },
    })
    if err != nil {
        log.Fatal(err)
    }

    model := client.NewLLM(hastekit.LLMOptions{
        Provider: llm.ProviderNameOpenAI,
        Model:    "gpt-4.1-mini",
    })

    history := client.NewConversationManager()

    agent := agents.NewAgent(&agents.AgentOptions{
        Name:        "Hello world agent",
        Instruction: client.Prompt("You are a helpful assistant."),
        LLM:         model,
        History:     history,
    })

    threadID := uuid.NewString()

    handle, err := agent.Execute(context.Background(), &agents.AgentInput{
        Namespace: "default",
        ThreadID:  threadID,
        Messages: []responses.InputMessageUnion{
            responses.UserMessage("Hello! My name is Alice"),
        },
    })
    if err != nil {
        log.Fatal(err)
    }
    out, err := handle.Result()
    if err != nil {
        log.Fatal(err)
    }
    b, _ := sonic.Marshal(out)
    fmt.Println(string(b))

    // The agent itself is stateless: you can create a new agent or reuse the
    // same instance, as long as you pass the same ThreadID.
    agent2 := agents.NewAgent(&agents.AgentOptions{
        Name:        "Hello world agent",
        Instruction: client.Prompt("You are a helpful assistant."),
        LLM:         model,
        History:     history,
    })

    handle, err = agent2.Execute(context.Background(), &agents.AgentInput{
        Namespace: "default",
        ThreadID:  threadID, // Same thread ID continues the conversation
        Messages: []responses.InputMessageUnion{
            responses.UserMessage("What's my name?"),
        },
    })
    if err != nil {
        log.Fatal(err)
    }
    out, err = handle.Result()
    if err != nil {
        log.Fatal(err)
    }
    b, _ = sonic.Marshal(out)
    fmt.Println(string(b))
}
```