# Getting Started
Get up and running with Layers in under 5 minutes.
## What is Layers?
Layers is a unified AI API gateway that gives you access to 40 models (19 language, 21 image/multimodal) from 6 providers through a single API key and credit balance.
**Powered by the Hustle Together AI SDK.** Layers is built on the Hustle Together AI SDK, which provides the model registry (40 models from 6 providers: Anthropic, OpenAI, Google, Perplexity, Morph, BFL, and Recraft), real-time pricing data synced daily, and routing infrastructure via Vercel AI Gateway. On top of the SDK, Layers adds credit management, authentication, rate limiting, and Stripe billing.
| Challenge | Layers Solution |
|---|---|
| Managing multiple API keys | One API key for all providers |
| Different billing accounts | Single credit balance |
| Inconsistent APIs | OpenAI-compatible format |
| Rate limit management | Unified rate limiting by tier |
| Usage tracking | Built-in analytics dashboard |
## Quick Start
### Get Your API Key

Sign up at [layers.hustletogether.com/dashboard](https://layers.hustletogether.com/dashboard) to create your API key. Your key will look like:

```
lyr_live_sk_1234567890abcdef...
```

### Make Your First Request
Use cURL, TypeScript, or Python to make your first API call:
**cURL**

```bash
curl -X POST https://layers.hustletogether.com/api/v1/chat \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -d '{
    "model": "anthropic/claude-sonnet-4.5",
    "messages": [
      {"role": "user", "content": "Say hello!"}
    ]
  }'
```

**TypeScript**
```typescript
const response = await fetch('https://layers.hustletogether.com/api/v1/chat', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'Authorization': 'Bearer YOUR_API_KEY',
  },
  body: JSON.stringify({
    model: 'anthropic/claude-sonnet-4.5',
    messages: [{ role: 'user', content: 'Say hello!' }],
  }),
});

const data = await response.json();
console.log(data.choices[0].message.content);
```

**Python**
```python
import requests

response = requests.post(
    'https://layers.hustletogether.com/api/v1/chat',
    headers={
        'Content-Type': 'application/json',
        'Authorization': 'Bearer YOUR_API_KEY'
    },
    json={
        'model': 'anthropic/claude-sonnet-4.5',
        'messages': [{'role': 'user', 'content': 'Say hello!'}]
    }
)
print(response.json()['choices'][0]['message']['content'])
```

### Check the Response
A successful response looks like:
```json
{
  "id": "chatcmpl-abc123",
  "object": "chat.completion",
  "model": "anthropic/claude-sonnet-4.5",
  "choices": [{
    "index": 0,
    "message": {
      "role": "assistant",
      "content": "Hello! How can I help you today?"
    },
    "finish_reason": "stop"
  }],
  "usage": {
    "prompt_tokens": 12,
    "completion_tokens": 8,
    "total_tokens": 20
  },
  "layers": {
    "credits_used": 0.05,
    "latency_ms": 342,
    "cost_breakdown": {
      "base_cost_usd": 0.00031,
      "margin_percent": 60,
      "total_cost_usd": 0.0005
    }
  }
}
```

## Try Different Models
Just change the `model` parameter:

```js
// Fast and cheap
"model": "openai/gpt-4o-mini"

// Balanced performance
"model": "anthropic/claude-sonnet-4.5"

// Best quality
"model": "anthropic/claude-opus-4.5"

// Web search included
"model": "perplexity/sonar-pro"
```

See the Model Selection Guide for the full list of available models.
Using with Vercel AI SDK#
If you're building with Next.js, use the Vercel AI SDK with Layers as an OpenAI-compatible provider:
First, install the dependencies:

```bash
npm install ai @ai-sdk/openai
```

Create the Layers provider (e.g. in `lib/layers.ts`):

```typescript
import { createOpenAI } from '@ai-sdk/openai';

// Create Layers provider using the OpenAI adapter
export const layers = createOpenAI({
  baseURL: 'https://layers.hustletogether.com/api/v1',
  apiKey: process.env.LAYERS_API_KEY,
});
```

Then use it in a route handler:

```typescript
import { generateText } from 'ai';
import { layers } from '@/lib/layers';

export async function POST(req: Request) {
  const { prompt } = await req.json();

  const { text } = await generateText({
    model: layers('anthropic/claude-sonnet-4.5'),
    prompt,
  });

  return Response.json({ text });
}
```

## Subscription Tiers
| Tier | Price | Credits | Rate Limit |
|---|---|---|---|
| Free | $0 | 50/month | 10 req/min |
| Starter | $20/mo | 500/month | 60 req/min |
| Pro | $100/mo | 3,000/month | 300 req/min |
| Team | $200/mo | 7,500/month | 1,000 req/min |
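To see how the numbers fit together, here is back-of-the-envelope arithmetic using the example response from the quick start. The margin formula (total = base cost × (1 + margin)) is an assumption inferred from that example, not a documented guarantee, and actual `credits_used` varies by model and request size:

```python
# Reproduce the cost_breakdown from the example response.
# Assumption: total cost = base cost * (1 + margin_percent / 100).
base_cost_usd = 0.00031
margin_percent = 60
total_cost_usd = round(base_cost_usd * (1 + margin_percent / 100), 4)
# This matches the example's total_cost_usd of 0.0005.

# Rough requests-per-month per tier, if every request used
# 0.05 credits like the example (real usage will differ):
credits_used_per_request = 0.05
tier_credits = {"Free": 50, "Starter": 500, "Pro": 3000, "Team": 7500}
requests_per_month = {
    tier: int(credits / credits_used_per_request)
    for tier, credits in tier_credits.items()
}
```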
See Billing & Credits for detailed pricing information.
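If you exceed your tier's rate limit, requests will be refused until the window resets. Assuming the API signals this with HTTP 429 (the standard convention, not confirmed by this page), a generic retry-with-exponential-backoff sketch looks like this; `send` is a stand-in for your own HTTP call, such as a `requests.post` wrapper:

```python
import time

def post_with_retry(send, max_retries=3, base_delay=1.0):
    """Call send() -> (status, body), retrying with exponential
    backoff whenever the status is 429 (rate limited)."""
    for attempt in range(max_retries + 1):
        status, body = send()
        if status != 429:
            return status, body
        if attempt < max_retries:
            # Wait 1s, 2s, 4s, ... before the next attempt.
            time.sleep(base_delay * (2 ** attempt))
    return status, body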