One API.
Every Model.
Zool unifies Claude, GPT-4, Gemini, and more behind a single interface. Automatic fallback. Intelligent routing. Zero vendor lock-in.
What happens when Claude is down? When GPT-4 rate-limits you? Zool automatically fails over to the next best model. Your app keeps running. Your users never notice.
Why Zool?
A gateway that makes LLM integration painless
Automatic Fallback
If Claude is down or rate-limited, requests automatically route to GPT-4 or Gemini. Zero downtime.
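The fallback loop Zool automates can be sketched in a few lines: try each provider in order and move on when one throws. This is a minimal illustration of the pattern, not Zool's internals; the `Provider` names and the `callProvider` signature are hypothetical.

```typescript
// Sketch of provider fallback: try each provider in order, moving on when
// one throws (downtime, rate limit). Names and signatures are illustrative.
type Provider = 'claude' | 'openai' | 'gemini';

type ChatResult = { provider: Provider; content: string };

async function chatWithFallback(
  prompt: string,
  providers: Provider[],
  callProvider: (p: Provider, prompt: string) => Promise<string>,
): Promise<ChatResult> {
  let lastError: unknown;
  for (const provider of providers) {
    try {
      const content = await callProvider(provider, prompt);
      return { provider, content }; // first success wins
    } catch (err) {
      lastError = err; // e.g. a 429 or 503 -- fall through to the next provider
    }
  }
  throw new Error(`All providers failed: ${String(lastError)}`);
}
```

Because the caller only sees the final result, a failed primary provider is invisible to application code.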
Intelligent Routing
Route coding tasks to Claude, creative writing to GPT-4, long context to Gemini. Use the best model for each task.
Cost Optimization
Route simple queries to cheaper models. Save Claude for complex reasoning. Cut costs by up to 60%.
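Cost-based routing can be as simple as classifying a prompt and picking a tier. The sketch below is a hedged illustration only: the model names and the length/keyword heuristic are assumptions, not Zool's actual routing policy.

```typescript
// Illustrative cost-based routing: send short, simple prompts to a cheap
// model and reserve a premium model for complex reasoning. The heuristic
// and model choices here are assumptions for demonstration.
const CHEAP_MODEL = 'gemini';
const PREMIUM_MODEL = 'claude';

function pickModelByCost(prompt: string): string {
  const complexHints = ['prove', 'analyze', 'step by step', 'refactor'];
  const looksComplex =
    prompt.length > 500 ||
    complexHints.some((hint) => prompt.toLowerCase().includes(hint));
  return looksComplex ? PREMIUM_MODEL : CHEAP_MODEL;
}
```

In practice a gateway can use a far richer signal (token counts, task hints, past latency), but the shape of the decision is the same: classify, then route.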
Unified Analytics
One dashboard for all providers. Track costs, latency, and usage across models.
Simple Integration
Drop-in replacement for the OpenAI SDK. Switch from direct API calls to Zool in minutes.
No Lock-in
Add new providers, remove old ones. Your code stays the same.
Simple to Use
Familiar API, powerful features underneath
```typescript
import { Zool } from '@alpha-pm/zool';

const zool = new Zool({
  // Configure providers
  providers: ['claude', 'openai', 'gemini'],
  // Automatic fallback order
  fallbackOrder: ['claude', 'openai', 'gemini'],
});

// Use like any LLM client
const response = await zool.chat({
  messages: [{ role: 'user', content: 'Explain recursion' }],
  // Optional: prefer a specific model
  prefer: 'claude',
  // Optional: routing hints
  hints: { task: 'coding' },
});

// Response includes metadata
console.log(response.provider); // 'claude'
console.log(response.content);  // '...'
console.log(response.cost);     // 0.002
```
Right Model, Right Task
Automatic routing based on task type
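Task-based routing boils down to a task-to-model table. The sketch below mirrors the pairings described above (coding to Claude, creative writing to GPT-4, long context to Gemini); the task labels and the `general` default are illustrative assumptions, not Zool's actual routing table.

```typescript
// Illustrative task-to-model routing table. Labels and mapping are
// assumptions for demonstration, not the gateway's real configuration.
type Task = 'coding' | 'creative' | 'long-context' | 'general';

const ROUTE_TABLE: Record<Task, string> = {
  coding: 'claude',         // strongest on code
  creative: 'openai',       // creative writing
  'long-context': 'gemini', // largest context window
  general: 'claude',        // sensible default
};

function routeByTask(task: Task): string {
  return ROUTE_TABLE[task];
}
```

A lookup like this is what the `hints: { task: 'coding' }` option in the example above would feed into.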
Ready for Multi-Model?
Zool brings reliability and flexibility to your LLM integration.