HasteKit LLM Gateway supports Anthropic’s Messages API for text generation, images, tool calling, and reasoning. You can use any Anthropic SDK by pointing it at HasteKit’s gateway endpoint. This lets you leverage HasteKit features such as virtual keys, provider management, and observability while keeping your existing Anthropic code.
Overview
The HasteKit LLM Gateway provides an endpoint at /api/gateway/anthropic/v1/messages that implements Anthropic’s Messages API specification. This means you can use any Anthropic SDK (Python, JavaScript, Go, etc.) without modifying your code: just change the base URL.
Usage Examples
main.py
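The original `main.py` example is not shown above; here is a minimal sketch using only the Python standard library. The gateway host (`https://your-hastekit-host`), the placeholder key, and the model id are assumptions — substitute your own values.

```python
import json
import urllib.request

# Assumed values -- replace with your own gateway host and key.
GATEWAY_URL = "https://your-hastekit-host/api/gateway/anthropic/v1/messages"
API_KEY = "sk-hk-your-virtual-key"  # virtual key or a direct Anthropic API key

def build_request(prompt: str) -> urllib.request.Request:
    """Build a Messages API request aimed at the HasteKit gateway endpoint."""
    payload = {
        "model": "claude-sonnet-4-20250514",  # assumed model id; use your own
        "max_tokens": 1024,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        GATEWAY_URL,
        data=json.dumps(payload).encode(),
        headers={
            "x-api-key": API_KEY,
            "anthropic-version": "2023-06-01",
            "content-type": "application/json",
        },
    )

def send(prompt: str) -> str:
    """POST the request and return the first text block of the response."""
    with urllib.request.urlopen(build_request(prompt)) as resp:
        return json.load(resp)["content"][0]["text"]
```

If you use the official Anthropic SDK instead, the same idea applies: keep your code unchanged and point the client's base URL at the gateway.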
Streaming Support
The gateway supports streaming responses via Server-Sent Events (SSE). Use your SDK’s streaming methods as you normally would:
streaming.py
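The original `streaming.py` example is not shown above; as a sketch of what SSE streaming involves, the helper below parses `data:` lines into JSON events and concatenates the text deltas (`content_block_delta` is the standard Messages API event type for streamed text). Sending the request itself is the same as the non-streaming case with `"stream": true` in the payload.

```python
import json

def iter_sse_events(lines):
    """Yield decoded JSON payloads from Server-Sent Events 'data:' lines."""
    for raw in lines:
        line = raw.strip()
        if line.startswith("data:"):
            data = line[len("data:"):].strip()
            if data:
                yield json.loads(data)

def collect_text(lines):
    """Concatenate streamed text deltas from Messages API SSE events."""
    out = []
    for event in iter_sse_events(lines):
        if event.get("type") == "content_block_delta":
            out.append(event["delta"].get("text", ""))
    return "".join(out)
```

With an SDK, prefer its built-in streaming helpers; this shows what happens under the hood when the gateway relays the SSE stream.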
Supported Features
The gateway currently supports the Messages API with:
- ✅ Text generation - Standard text completions
- ✅ Images - Image input (vision capabilities)
- ✅ Tool calling - Function calling capabilities
- ✅ Reasoning - Advanced reasoning models
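Since the gateway passes through the standard Messages API, tool calling uses the regular Anthropic `tools` request field. The sketch below builds such a request body and extracts a `tool_use` block from a response; the tool itself (`get_weather`) and the model id are hypothetical.

```python
# Hypothetical tool definition, following the Messages API tool format.
WEATHER_TOOL = {
    "name": "get_weather",
    "description": "Get the current weather for a city.",
    "input_schema": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}

def tool_call_payload(prompt: str) -> dict:
    """Build a Messages API request body that offers the model one tool."""
    return {
        "model": "claude-sonnet-4-20250514",  # assumed model id; use your own
        "max_tokens": 1024,
        "tools": [WEATHER_TOOL],
        "messages": [{"role": "user", "content": prompt}],
    }

def find_tool_use(response: dict):
    """Return the first tool_use content block from a response, if any."""
    for block in response.get("content", []):
        if block.get("type") == "tool_use":
            return block
    return None
```

If the model decides to call the tool, you run it yourself and send the result back in a follow-up message, exactly as with the Anthropic API directly.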
Authentication
The gateway accepts authentication via the x-api-key header:
- Virtual Key: Use a virtual key (starts with sk-hk-) for managed access control
- Direct API Key: Use your Anthropic API key directly