Frequently Asked Questions
Everything you need to know about AI Supreme Council.
What is AI Supreme Council?
AI Supreme Council is a browser-based platform that lets you chat with multiple AI models (Claude, GPT, Grok, Gemini, and 300+ others) from a single interface. In Council mode, multiple models debate your question through a three-phase deliberation process — fan-out, peer review, and synthesis — to deliver a cross-checked consensus answer.
Is it really free?
Yes. The free tier gives you access to free models from OpenRouter and Google Gemini with no credit card required. You can also bring your own API keys (BYOK) to use any model on the free plan. The optional Lite subscription ($3-9/month) adds premium features.
Do you store my conversations?
No. All conversations are stored locally in your browser's IndexedDB. There is no backend server processing or storing your chats. Your messages go directly from your browser to each AI provider's API. We never see, intercept, or store your conversations.
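For the technically curious, here is a minimal sketch of what browser-local persistence with IndexedDB looks like. The database name, store name, and Conversation shape below are illustrative, not the app's actual schema.

```typescript
// Minimal sketch: persisting a conversation in the browser's IndexedDB.
// Database/store names and the Conversation shape are illustrative only.

interface Conversation {
  id: string;
  title: string;
  messages: { role: "user" | "assistant"; content: string }[];
  updatedAt: number;
}

function openDb(): Promise<IDBDatabase> {
  return new Promise((resolve, reject) => {
    const request = indexedDB.open("council-chat", 1);
    request.onupgradeneeded = () => {
      // Create the object store on first run (or on a version bump).
      request.result.createObjectStore("conversations", { keyPath: "id" });
    };
    request.onsuccess = () => resolve(request.result);
    request.onerror = () => reject(request.error);
  });
}

async function saveConversation(convo: Conversation): Promise<void> {
  const db = await openDb();
  await new Promise<void>((resolve, reject) => {
    const tx = db.transaction("conversations", "readwrite");
    tx.objectStore("conversations").put(convo); // upsert by id
    tx.oncomplete = () => resolve();
    tx.onerror = () => reject(tx.error);
  });
}
```

Because everything lives in IndexedDB, clearing the browser's site data deletes your history, and no copy exists anywhere else.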
What is BYOK (Bring Your Own Key)?
BYOK means you use your own API keys from providers like Anthropic, OpenAI, xAI, Google, or OpenRouter. Your keys are stored in your browser's localStorage and sent directly to the provider — we never see or store them. You pay each provider directly at their rates.
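As an illustration of the BYOK flow, the sketch below stores a key in localStorage and attaches it to a request that goes straight from the browser to the provider. The storage key name and the model ID are assumptions for the example; the endpoint and headers shown are Anthropic's public Messages API.

```typescript
// Sketch only: keeping a user-supplied key in localStorage and sending it
// directly to the provider. Storage key names and the model ID are illustrative.

function saveApiKey(provider: string, key: string): void {
  localStorage.setItem(`apiKey:${provider}`, key);
}

async function callAnthropic(prompt: string): Promise<string> {
  const apiKey = localStorage.getItem("apiKey:anthropic");
  if (!apiKey) throw new Error("No Anthropic key configured");

  // The request goes browser -> api.anthropic.com; there is no intermediate server.
  const res = await fetch("https://api.anthropic.com/v1/messages", {
    method: "POST",
    headers: {
      "content-type": "application/json",
      "x-api-key": apiKey,
      "anthropic-version": "2023-06-01",
      // Opt-in header Anthropic requires for direct browser (CORS) calls.
      "anthropic-dangerous-direct-browser-access": "true",
    },
    body: JSON.stringify({
      model: "claude-sonnet-4-20250514", // placeholder model ID
      max_tokens: 1024,
      messages: [{ role: "user", content: prompt }],
    }),
  });
  const data = await res.json();
  return data.content[0].text;
}
```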
How does Council mode work?
Council mode runs a three-phase deliberation: (1) Fan-out — your question is sent to multiple models in parallel and each produces an independent response; (2) Peer review — each model reviews the other models' answers, identifying agreements, errors, and missing perspectives; (3) Synthesis — a chairman model produces a final consensus answer noting where models agreed and diverged.
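A rough sketch of that pipeline, assuming a generic ask(model, prompt) helper rather than the app's actual internals (prompt wording and function names are illustrative):

```typescript
// Three-phase deliberation sketch. ask(model, prompt) is assumed to return a
// model's text reply; it stands in for whatever provider call is configured.

declare function ask(model: string, prompt: string): Promise<string>;

async function council(question: string, models: string[], chairman: string): Promise<string> {
  // Phase 1: fan-out. Each model answers independently, in parallel.
  const drafts = await Promise.all(models.map((m) => ask(m, question)));

  // Phase 2: peer review. Each model critiques the other models' drafts.
  const reviews = await Promise.all(
    models.map((m, i) => {
      const others = drafts.filter((_, j) => j !== i).join("\n---\n");
      return ask(
        m,
        `Question: ${question}\n\nOther answers:\n${others}\n\nNote agreements, errors, and missing perspectives.`
      );
    })
  );

  // Phase 3: synthesis. A chairman model merges drafts and reviews into one answer.
  return ask(
    chairman,
    `Question: ${question}\n\nDrafts:\n${drafts.join("\n---\n")}\n\nReviews:\n${reviews.join("\n---\n")}\n\nWrite a consensus answer, noting where the models agreed and diverged.`
  );
}
```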
What AI models are supported?
Claude (Anthropic), GPT and o-series (OpenAI), Grok (xAI), Gemini (Google), 300+ models via OpenRouter, and any model you run locally with Ollama. The platform is extensible — new providers can be registered programmatically.
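What "registered programmatically" could look like is sketched below; the Provider interface and function names are hypothetical, not the app's actual extension API.

```typescript
// Hypothetical shape of a provider registry; the real extension API may differ.

interface Provider {
  id: string; // e.g. "anthropic", "ollama"
  listModels(): Promise<string[]>;
  chat(model: string, messages: { role: string; content: string }[]): Promise<string>;
}

const providers = new Map<string, Provider>();

function registerProvider(p: Provider): void {
  providers.set(p.id, p);
}
```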
How is my data protected?
AI Supreme Council uses a zero-server architecture. There is no backend server, no database, and no application server to breach. Your data stays in your browser. API keys never leave your device except when they are sent directly to the AI provider. All connections use HTTPS/TLS encryption. Bot sharing URLs use the URL fragment (after the #), which browsers never send to servers.
How does bot sharing work?
Bot configurations (name, model, system prompt, temperature) are encoded into the URL fragment using Base80 compression. The URL fragment is the part after the # symbol, which browsers never send to servers. When someone opens a shared URL, they get your exact bot configuration — they just need to add their own API key.
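The sketch below shows the fragment mechanics only: it packs the config with plain base64url instead of the app's Base80 codec, and the field names are taken from the list above.

```typescript
// Fragment-based sharing sketch. The app's actual Base80 codec is not shown;
// base64url is used here purely to illustrate how the "#" part carries the config.

interface BotConfig {
  name: string;
  model: string;
  systemPrompt: string;
  temperature: number;
}

function shareUrl(config: BotConfig): string {
  const packed = btoa(JSON.stringify(config))
    .replace(/\+/g, "-")
    .replace(/\//g, "_")
    .replace(/=+$/, "");
  // Everything after "#" stays in the browser; it is never sent to any server.
  return `${location.origin}/#bot=${packed}`;
}

function readSharedBot(): BotConfig | null {
  const match = location.hash.match(/bot=([^&]+)/);
  if (!match) return null;
  const b64 = match[1].replace(/-/g, "+").replace(/_/g, "/");
  return JSON.parse(atob(b64));
}
```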
Can I use local AI models?
Yes. AI Supreme Council supports Ollama for running local LLMs on your own hardware. No API key is required. Models are auto-detected when Ollama is running locally, and you can configure a custom endpoint in Settings. This gives you complete privacy since no data leaves your machine.
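A minimal sketch of both steps, using Ollama's standard local endpoint (http://localhost:11434) and its public /api/tags and /api/generate routes; the endpoint constant stands in for whatever custom URL you configure.

```typescript
// Sketch of talking to a local Ollama instance.

const OLLAMA_URL = "http://localhost:11434"; // replace with a custom endpoint if needed

// Auto-detect: list locally installed models (returns [] if Ollama isn't running).
async function detectOllamaModels(): Promise<string[]> {
  try {
    const res = await fetch(`${OLLAMA_URL}/api/tags`);
    const data = await res.json();
    return data.models.map((m: { name: string }) => m.name);
  } catch {
    return [];
  }
}

// Single non-streaming completion against a local model; nothing leaves the machine.
async function askLocal(model: string, prompt: string): Promise<string> {
  const res = await fetch(`${OLLAMA_URL}/api/generate`, {
    method: "POST",
    headers: { "content-type": "application/json" },
    body: JSON.stringify({ model, prompt, stream: false }),
  });
  const data = await res.json();
  return data.response;
}
```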
How does geo-tiered pricing work?
The Lite subscription price ranges from $3 to $9 per month (USD equivalent) based on your country's cost of living. Your pricing tier is determined automatically via Cloudflare's country detection at the time of subscription. This ensures fair pricing worldwide. Payments are processed by Stripe and PayPal.
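Purely as an illustration of the lookup, the sketch below maps a country code to a price tier. The country-to-tier table is invented for the example; the real mapping is set by the service. In a Cloudflare Worker the visitor's country is exposed as request.cf.country (or via the CF-IPCountry request header).

```typescript
// Illustrative tier lookup only; the actual country-to-price mapping is not public.

const TIER_PRICES_USD = { high: 9, mid: 6, low: 3 } as const;

// Example mapping, invented for this sketch.
const COUNTRY_TIER: Record<string, keyof typeof TIER_PRICES_USD> = {
  US: "high",
  PL: "mid",
  IN: "low",
};

function monthlyPrice(countryCode: string): number {
  // Fall back to the top tier when a country isn't in the example table.
  return TIER_PRICES_USD[COUNTRY_TIER[countryCode] ?? "high"];
}
```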