API Rate Limiting
Last Updated: October 15, 2025
Category: Billing
Overview
The Waymaker One API uses rate limiting to ensure fair usage, maintain service quality, and prevent abuse. Rate limits are applied per organization, not per API key.
Key points:
- No subscription tiers - All organizations have the same rate limits
- Credit-based billing - You pay only for what you use
- Generous limits - Designed for production applications
- Per-organization - Limits apply across all API keys in your organization
Rate Limits
Standard Limits
All organizations receive:
| Limit Type | Value | Window |
|---|---|---|
| Requests per minute | 60 requests | 1 minute |
| Requests per hour | 1,000 requests | 1 hour |
| Requests per day | 10,000 requests | 24 hours |
| Concurrent requests | 10 simultaneous | Real-time |
These limits apply:
- Per organization - Shared across all API keys
- All endpoints - /v1/coordinate, /v1/framework, etc.
- Automatically - No configuration needed
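For illustration, the three request windows in the table above can be tracked client-side before sending a request. The sketch below assumes you keep an array of recent request timestamps; the `canSendRequest` helper is illustrative, not part of the API:

```javascript
// Standard limits from the table above
const LIMITS = [
  { max: 60, windowMs: 60 * 1000 },               // per minute
  { max: 1000, windowMs: 60 * 60 * 1000 },        // per hour
  { max: 10000, windowMs: 24 * 60 * 60 * 1000 }   // per day
];

// A request is allowed only if every window still has headroom
function canSendRequest(timestamps, now = Date.now()) {
  return LIMITS.every(
    ({ max, windowMs }) =>
      timestamps.filter(t => now - t < windowMs).length < max
  );
}
```

Because limits are shared across all API keys in your organization, a check like this is only a local estimate; the response headers (below) are the authoritative source.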
Enterprise Limits
For high-volume applications, contact sales for:
- Higher request limits (500+ requests/minute)
- Dedicated capacity
- Custom rate limits per endpoint
- SLA guarantees
Contact: sales@waymakerone.com
Rate Limit Headers
Response Headers
Every API response includes rate limiting information:
```http
HTTP/1.1 200 OK
X-RateLimit-Limit: 60
X-RateLimit-Remaining: 54
X-RateLimit-Reset: 1697385600
X-RateLimit-Window: minute
```
| Header | Description |
|---|---|
| X-RateLimit-Limit | Maximum requests per window |
| X-RateLimit-Remaining | Remaining requests in current window |
| X-RateLimit-Reset | Unix timestamp when window resets |
| X-RateLimit-Window | Time window (minute, hour, day) |
Example reading headers:
```javascript
const response = await fetch('https://api.waymakerone.com/v1/coordinate', {
  method: 'POST',
  headers: {
    'apikey': process.env.WAYMAKER_API_KEY,
    'Content-Type': 'application/json'
  },
  body: JSON.stringify({ messages })
});

const limit = response.headers.get('X-RateLimit-Limit');
const remaining = response.headers.get('X-RateLimit-Remaining');
const reset = parseInt(response.headers.get('X-RateLimit-Reset'), 10);

console.log(`${remaining}/${limit} requests remaining`);
console.log(`Resets at: ${new Date(reset * 1000).toISOString()}`);
```
Rate Limit Errors
429 Too Many Requests
When rate limits are exceeded:
```json
{
  "error": "rate_limited",
  "message": "Rate limit exceeded",
  "code": "RATE_LIMIT_EXCEEDED",
  "details": {
    "limit": 60,
    "window": "minute",
    "resetTime": "2025-10-15T14:35:00Z",
    "retryAfter": 42
  }
}
```
Response status: 429 Too Many Requests
Error fields:
- `error` - Error type identifier
- `message` - Human-readable description
- `code` - Machine-readable error code
- `details.limit` - Rate limit that was exceeded
- `details.window` - Time window (minute, hour, day)
- `details.resetTime` - When the limit resets (ISO 8601)
- `details.retryAfter` - Seconds to wait before retry
Retry-After header:
```http
HTTP/1.1 429 Too Many Requests
Retry-After: 42
```
Use this value to wait before retrying the request.
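As a minimal sketch, a hypothetical `waitForRetry` helper can read the header and pause before the next attempt (the helper name and the 1-second fallback are illustrative; the backoff examples in the next section show a fuller treatment):

```javascript
// Illustrative helper: sleep for the number of seconds in Retry-After.
// Falls back to 1 second if the header is missing (assumption, not API behavior).
async function waitForRetry(response) {
  const retryAfter = parseInt(response.headers.get('Retry-After') ?? '1', 10);
  await new Promise(resolve => setTimeout(resolve, retryAfter * 1000));
}
```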
Handling Rate Limits
1. Monitor Headers
Check rate limit status before hitting the limit:
```javascript
async function callAPI(request) {
  const response = await fetch('https://api.waymakerone.com/v1/coordinate', {
    method: 'POST',
    headers: {
      'apikey': process.env.WAYMAKER_API_KEY,
      'Content-Type': 'application/json'
    },
    body: JSON.stringify(request)
  });

  // Check rate limit status
  const remaining = parseInt(response.headers.get('X-RateLimit-Remaining'), 10);
  if (remaining < 5) {
    console.warn('Approaching rate limit. Slowing requests.');
    await new Promise(resolve => setTimeout(resolve, 1000));
  }

  return response;
}
```
2. Implement Exponential Backoff
Retry rate-limited requests with exponential backoff:
```javascript
async function callAPIWithBackoff(request, maxRetries = 3) {
  for (let attempt = 0; attempt < maxRetries; attempt++) {
    try {
      const response = await fetch('https://api.waymakerone.com/v1/coordinate', {
        method: 'POST',
        headers: {
          'apikey': process.env.WAYMAKER_API_KEY,
          'Content-Type': 'application/json'
        },
        body: JSON.stringify(request)
      });

      if (response.status === 429) {
        const retryAfter = response.headers.get('Retry-After');
        const waitTime = retryAfter
          ? parseInt(retryAfter, 10) * 1000
          : Math.pow(2, attempt) * 1000; // Exponential backoff

        console.log(`Rate limited. Waiting ${waitTime}ms before retry.`);
        await new Promise(resolve => setTimeout(resolve, waitTime));
        continue;
      }

      return response;
    } catch (error) {
      if (attempt === maxRetries - 1) throw error;
    }
  }

  throw new Error('Max retries exceeded');
}
```
3. Use Request Queuing
Queue requests to stay within rate limits:
```javascript
class RateLimiter {
  constructor(requestsPerMinute = 60) {
    this.requestsPerMinute = requestsPerMinute;
    this.queue = [];
    this.processing = false;
    this.requestTimestamps = [];
  }

  async addRequest(requestFn) {
    return new Promise((resolve, reject) => {
      this.queue.push({ requestFn, resolve, reject });
      this.processQueue();
    });
  }

  async processQueue() {
    if (this.processing || this.queue.length === 0) return;
    this.processing = true;

    while (this.queue.length > 0) {
      // Clean old timestamps (older than 1 minute)
      const now = Date.now();
      this.requestTimestamps = this.requestTimestamps.filter(
        t => now - t < 60000
      );

      // Check if we can make another request
      if (this.requestTimestamps.length >= this.requestsPerMinute) {
        const oldestTimestamp = this.requestTimestamps[0];
        const waitTime = 60000 - (now - oldestTimestamp);
        console.log(`Rate limit reached. Waiting ${waitTime}ms.`);
        await new Promise(resolve => setTimeout(resolve, waitTime));
        continue;
      }

      // Process next request
      const { requestFn, resolve, reject } = this.queue.shift();
      this.requestTimestamps.push(Date.now());

      try {
        const result = await requestFn();
        resolve(result);
      } catch (error) {
        reject(error);
      }
    }

    this.processing = false;
  }
}

// Usage
const limiter = new RateLimiter(60); // 60 requests per minute

async function callAPIWithLimiter(request) {
  return limiter.addRequest(() =>
    fetch('https://api.waymakerone.com/v1/coordinate', {
      method: 'POST',
      headers: {
        'apikey': process.env.WAYMAKER_API_KEY,
        'Content-Type': 'application/json'
      },
      body: JSON.stringify(request)
    })
  );
}
```
4. Implement Circuit Breaker
Prevent cascading failures with the circuit breaker pattern:
```javascript
class CircuitBreaker {
  constructor(threshold = 5, timeout = 60000) {
    this.failureThreshold = threshold;
    this.timeout = timeout;
    this.failures = 0;
    this.state = 'CLOSED'; // CLOSED, OPEN, HALF_OPEN
    this.nextAttempt = Date.now();
  }

  async call(requestFn) {
    if (this.state === 'OPEN') {
      if (Date.now() < this.nextAttempt) {
        throw new Error('Circuit breaker is OPEN');
      }
      this.state = 'HALF_OPEN';
    }

    let response;
    try {
      response = await requestFn();
    } catch (error) {
      this.recordFailure();
      throw error;
    }

    if (response.status === 429) {
      // Record the rate limit as a failure exactly once, outside the
      // try block so it isn't double-counted by the catch above
      this.recordFailure();
      throw new Error('Rate limit exceeded');
    }

    this.recordSuccess();
    return response;
  }

  recordSuccess() {
    this.failures = 0;
    this.state = 'CLOSED';
  }

  recordFailure() {
    this.failures++;
    if (this.failures >= this.failureThreshold) {
      this.state = 'OPEN';
      this.nextAttempt = Date.now() + this.timeout;
      console.log(`Circuit breaker OPEN. Retry at ${new Date(this.nextAttempt).toISOString()}`);
    }
  }
}

// Usage
const breaker = new CircuitBreaker(5, 60000);

async function callAPIWithBreaker(request) {
  return breaker.call(() =>
    fetch('https://api.waymakerone.com/v1/coordinate', {
      method: 'POST',
      headers: {
        'apikey': process.env.WAYMAKER_API_KEY,
        'Content-Type': 'application/json'
      },
      body: JSON.stringify(request)
    })
  );
}
```
Best Practices
1. Cache Responses
Reduce API calls by caching common responses:
```javascript
class APICache {
  constructor(ttl = 5 * 60 * 1000) {
    this.cache = new Map();
    this.ttl = ttl; // 5 minutes default
  }

  async get(key, fetcher) {
    const cached = this.cache.get(key);
    if (cached && Date.now() - cached.timestamp < this.ttl) {
      console.log('Cache hit:', key);
      return cached.data;
    }

    console.log('Cache miss:', key);
    const data = await fetcher();
    this.cache.set(key, { data, timestamp: Date.now() });
    return data;
  }

  clear() {
    this.cache.clear();
  }
}

// Usage
const cache = new APICache(5 * 60 * 1000);

const response = await cache.get('framework-basics', () =>
  fetch('https://api.waymakerone.com/v1/framework', {
    method: 'POST',
    headers: {
      'apikey': process.env.WAYMAKER_API_KEY,
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({
      messages: [{ role: 'user', content: 'Explain Business Model Canvas' }]
    })
  })
);
```
2. Batch Related Requests
Combine related questions in conversation threads:
```javascript
// ❌ BAD: Multiple separate requests
await fetch('/v1/framework', {
  body: JSON.stringify({
    messages: [{ role: 'user', content: 'Help with strategy canvas' }]
  })
});
await fetch('/v1/framework', {
  body: JSON.stringify({
    messages: [{ role: 'user', content: 'How to identify key partnerships?' }]
  })
});

// ✅ GOOD: Single conversation thread
await fetch('/v1/framework', {
  body: JSON.stringify({
    messages: [
      { role: 'user', content: 'Help with strategy canvas' },
      { role: 'assistant', content: '[Previous response...]' },
      { role: 'user', content: 'How to identify key partnerships?' }
    ],
    sessionId: 'session_123'
  })
});
```
3. Monitor Rate Limit Usage
Track usage to avoid hitting limits:
```javascript
class RateLimitMonitor {
  constructor() {
    this.requests = [];
  }

  recordRequest(response) {
    const limit = parseInt(response.headers.get('X-RateLimit-Limit'), 10);
    const remaining = parseInt(response.headers.get('X-RateLimit-Remaining'), 10);
    const reset = parseInt(response.headers.get('X-RateLimit-Reset'), 10);

    this.requests.push({
      timestamp: Date.now(),
      limit,
      remaining,
      reset
    });

    const percentage = ((limit - remaining) / limit) * 100;
    if (percentage > 80) {
      console.warn(`Rate limit: ${percentage.toFixed(1)}% used`);
    }
    if (percentage > 95) {
      console.error('Critical: Nearly at rate limit!');
    }
  }

  getStats() {
    const now = Date.now();
    const lastMinute = this.requests.filter(r => now - r.timestamp < 60000);
    const count = lastMinute.length;
    return {
      requestsLastMinute: count,
      // Guard against division by zero when no recent requests exist
      averageRemaining: count
        ? lastMinute.reduce((sum, r) => sum + r.remaining, 0) / count
        : null
    };
  }
}

// Usage
const monitor = new RateLimitMonitor();

const response = await fetch('https://api.waymakerone.com/v1/coordinate', {
  method: 'POST',
  headers: { 'apikey': process.env.WAYMAKER_API_KEY },
  body: JSON.stringify(request)
});
monitor.recordRequest(response);
```
4. Use Webhooks for Long Operations
For long-running operations, use webhooks instead of polling:
```javascript
// ❌ BAD: Polling wastes rate limit
async function pollForResult(operationId) {
  while (true) {
    const response = await fetch(`/v1/operations/${operationId}`);
    if (response.status === 200) return await response.json();
    await new Promise(resolve => setTimeout(resolve, 1000));
  }
}

// ✅ GOOD: Webhook callback
await fetch('/v1/operations', {
  method: 'POST',
  body: JSON.stringify({
    operation: 'complex_analysis',
    webhookUrl: 'https://yourapp.com/webhook/operations'
  })
});

// Your webhook endpoint receives the result when ready
app.post('/webhook/operations', (req, res) => {
  const result = req.body;
  processResult(result);
  res.status(200).send('OK');
});
```
Troubleshooting
Consistent 429 Errors
If you're consistently hitting rate limits:
Check:
- Request frequency - Are you making too many requests per minute?
- Concurrent requests - Are you exceeding 10 simultaneous requests?
- Multiple keys - Rate limits apply per organization, not per key
- Retry logic - Are failed requests causing retry storms?
Solutions:
- Implement request queuing (see examples above)
- Add caching for common responses
- Batch related requests in conversations
- Contact sales for enterprise limits if needed
Intermittent 429 Errors
If you occasionally hit rate limits:
Possible causes:
- Traffic spikes during peak usage
- Batch processing jobs
- User-triggered request floods
Solutions:
- Implement exponential backoff (see examples above)
- Add circuit breaker pattern
- Spread batch jobs across time
- Add rate limiting to user-facing features
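One simple way to spread a batch job across time is a fixed gap between requests. A sketch, with an illustrative gap of 1.1 seconds (about 54 requests/minute, safely under the 60/minute limit; the `runBatch` helper is not part of the API):

```javascript
// Run a batch sequentially with a fixed delay between requests,
// so the batch alone stays under the per-minute limit
async function runBatch(items, requestFn, gapMs = 1100) {
  const results = [];
  for (const item of items) {
    results.push(await requestFn(item));
    await new Promise(resolve => setTimeout(resolve, gapMs));
  }
  return results;
}
```

For user-triggered floods, the same idea applies in reverse: throttle or debounce the UI action so a single user cannot burn the whole organization's budget.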
Rate Limit Resets Not Working
If waiting doesn't resolve rate limits:
Check:
- Using the correct resetTime from the error response
- System clock is accurate (sync with NTP)
- Multiple processes sharing the same API key
Solution:
```javascript
// Ensure you're using the correct reset time
const resetTime = parseInt(response.headers.get('X-RateLimit-Reset'), 10);
const waitTime = (resetTime * 1000) - Date.now();

if (waitTime > 0) {
  console.log(`Waiting ${waitTime}ms until reset`);
  await new Promise(resolve => setTimeout(resolve, waitTime));
}
```
Enterprise Rate Limits
When to Upgrade
Consider enterprise rate limits if you:
- Consistently hit the 60 requests/minute limit
- Are building a high-volume production application
- Need guaranteed capacity during peak times
- Require an SLA for uptime and performance
What You Get
Enterprise includes:
- Custom rate limits (500+ requests/minute)
- Dedicated capacity (no sharing with other orgs)
- Priority processing for all requests
- 99.9% uptime SLA
- Dedicated support team
Contact: sales@waymakerone.com
Next Steps
- Credit System - Understanding API costs
- Usage Tracking - Monitor consumption
- Auto Top-Up - Automatic credit purchases
- API Authentication - Getting started
Questions about rate limiting? → Contact support@waymakerone.com