AI Providers and API Keys

PebbleFlow connects to 500+ AI models across multiple providers. You bring your own API keys: you pay the providers directly, and PebbleFlow never handles your spending.

Supported providers: OpenRouter (the default), Anthropic, Google Gemini, OpenAI, Poe, Ollama, and MLX (Apple Silicon). You can also access hundreds of additional models (Mistral, Groq, Llama, and more) through OpenRouter. See Terms of Use for policies.

Get Started (Two Options)

Option 1: Use the Free Provisioned Key

We provide a free API key that lets you try PebbleFlow immediately. It has access to select free models with usage limits—perfect for testing things out.

To use it, just open PebbleFlow and start chatting. No setup is needed.

Option 2: Bring Your Own Key (BYOK)

Want 500+ models to choose from? Add your own API key.

  1. Go to Settings > AI Provider
  2. Select your provider (OpenRouter, Anthropic, Google, OpenAI, Poe, Ollama, or MLX)
  3. Paste your API key (or set your Ollama server URL for local models)
  4. That's it—keys stay on your device and never leave your machine
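If you point PebbleFlow at a local Ollama server, it helps to confirm the server is reachable before pasting its URL. A minimal sketch, assuming Ollama's default address of http://localhost:11434 (its /api/tags endpoint lists installed models); the function name is illustrative, not part of PebbleFlow:

```python
import urllib.error
import urllib.request

def ollama_reachable(base_url: str = "http://localhost:11434", timeout: float = 2.0) -> bool:
    """Return True if an Ollama server answers at base_url.

    /api/tags is Ollama's endpoint for listing locally installed models.
    """
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

print(ollama_reachable())  # True only when a local Ollama server is running
```

If this prints False, start Ollama (or check the port) before entering the URL in Settings.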

[Screenshot: Model Configuration showing Default Model and Max Steps]

Quick Tasks Model

Quick tasks (like summarize, translate, or explain) can use a different model than your default. To set one, open the model picker dropdown in the header and choose a quick tasks model. If none is set, quick tasks fall back to your default model.
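The fallback rule above can be sketched in a few lines; the function name and model IDs here are illustrative, not PebbleFlow's actual internals:

```python
from typing import Optional

def resolve_quick_tasks_model(quick_tasks_model: Optional[str], default_model: str) -> str:
    """Pick the model for a quick task: its own setting if present, else the default."""
    return quick_tasks_model or default_model

# No quick tasks model configured: falls back to the default.
print(resolve_quick_tasks_model(None, "anthropic/claude-sonnet"))
# A configured quick tasks model wins.
print(resolve_quick_tasks_model("google/gemini-2.5-flash", "anthropic/claude-sonnet"))
```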

Where to Get API Keys

| Provider | Why Choose It | Get Key At |
| --- | --- | --- |
| OpenRouter (default) | 500+ models in one place (Claude, GPT, Gemini, Llama, and more) | openrouter.ai |
| Google | Direct access to Gemini models | ai.google.dev |
| Anthropic | Direct access to Claude models | console.anthropic.com |
| OpenAI | Direct access to GPT and DALL-E models | platform.openai.com |
| Poe | Access multiple AI models through Poe's API | poe.com |
| Ollama | Run open-source models locally on your machine, fully private | ollama.com |
| MLX (Apple Silicon) | Run models locally on Mac via MLX, fully private | No key needed, install MLX models locally |

OpenRouter is the default choice: PebbleFlow's tools expect OpenRouter's model registry, and a single OpenRouter key unlocks the entire catalog. Direct provider keys (Anthropic, Google, OpenAI) and local options (Ollama, MLX) work as alternatives when you want provider-specific features or fully offline inference.
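To make the "one key, whole catalog" idea concrete, here is a sketch of how a request to OpenRouter's OpenAI-compatible chat completions endpoint is shaped. It only builds the request rather than sending it; the key and model ID are placeholders:

```python
import json

def build_openrouter_request(api_key: str, model: str, prompt: str):
    """Assemble an OpenRouter chat-completion request without sending it."""
    url = "https://openrouter.ai/api/v1/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",  # your own key; you are billed by OpenRouter directly
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,  # any ID from the catalog, e.g. a Claude, GPT, or Gemini variant
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return url, headers, body

url, headers, body = build_openrouter_request("sk-or-demo", "openai/gpt-4.1-mini", "Hello")
```

Swapping models means changing only the `model` string; the key and endpoint stay the same.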

API Key Security

Your API keys are stored locally on your device. They're never sent to PebbleFlow servers. When you make a request, your key goes directly to the AI provider. See Privacy & Data for the full story.
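One common pattern for this kind of local-only storage is a user config file with owner-only permissions. The sketch below is a hypothetical illustration of the idea, not PebbleFlow's actual implementation:

```python
import json
import os
import tempfile
from pathlib import Path

def save_api_key(provider: str, key: str, config_dir: Path) -> Path:
    """Store a key in a local JSON file readable only by the current user."""
    config_dir.mkdir(parents=True, exist_ok=True)
    path = config_dir / "keys.json"
    keys = json.loads(path.read_text()) if path.exists() else {}
    keys[provider] = key
    path.write_text(json.dumps(keys))
    os.chmod(path, 0o600)  # owner read/write only; the file never leaves the machine
    return path

# Demo against a throwaway directory:
demo_dir = Path(tempfile.mkdtemp())
print(save_api_key("openrouter", "sk-or-demo", demo_dir))
```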

Performance Depends on Your Model

PebbleFlow handles orchestration — tools, context, multi-step workflows. But the quality of the AI's responses comes down to which model you choose.

For complex tasks (research, analysis, long documents): Use frontier models like Claude Opus/Sonnet, GPT-4.1, or Gemini 2.5 Pro. They handle multi-step reasoning and tool use best.

For quick answers (questions, summaries, simple edits): Fast models like Claude Haiku, GPT-4.1 mini, or Gemini 2.5 Flash are cheaper and faster. Perfect for everyday tasks.

For experimenting: OpenRouter lets you try 500+ models with one API key. Find what works best for your workflow without committing to a single provider.
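The guidance above amounts to a simple routing rule: default to a fast model and escalate only for heavy tasks. A sketch, with illustrative tier names and model IDs (actual IDs and pricing change over time):

```python
# Illustrative tiers only; pick models that match your own workflow.
MODEL_TIERS = {
    "complex": "anthropic/claude-sonnet-4",  # research, analysis, long documents
    "quick": "google/gemini-2.5-flash",      # questions, summaries, simple edits
}

def pick_model(task: str) -> str:
    # Default to the fast tier; escalate only for known heavy task types.
    heavy = {"research", "analysis", "long_document"}
    return MODEL_TIERS["complex" if task in heavy else "quick"]

print(pick_model("research"))
print(pick_model("summary"))
```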

PebbleFlow's tools, context layers, and orchestration amplify whatever model you pick. A good model becomes great when it has the right context and tools to work with.


This guide is maintained by the PebbleFlow team using Slate, our built-in editor.