AI Providers

Configure AI model providers for Astonish

Astonish supports 15+ AI providers. The easiest way to configure a provider is the interactive setup wizard:

```sh
astonish setup
```

You can also add and manage providers through Studio Settings > Providers.

| Provider | Type Key | Description |
| --- | --- | --- |
| OpenAI | `openai` | GPT-4o, GPT-4, GPT-3.5, o1, o3 |
| Anthropic | `anthropic` | Claude 4, Claude 3.5 Sonnet |
| Google Gemini | `gemini` | Gemini 2.5 Pro, Flash |
| AWS Bedrock | `bedrock` | Access Anthropic, Meta, etc. via AWS |
| Azure OpenAI | `azure` | OpenAI models via Azure |
| Ollama | `ollama` | Local models (Llama, Mistral, etc.) |
| OpenRouter | `openrouter` | Multi-provider routing |
| Groq | `groq` | Ultra-fast inference |
| DeepSeek | `deepseek` | DeepSeek models |
| Fireworks | `fireworks` | Fast inference platform |
| Cerebras | `cerebras` | Fast inference |
| Together | `together` | Open-source model hosting |
| Mistral | `mistral` | Mistral models |
| xAI | `xai` | Grok models |
| LM Studio | `lm_studio` | Local model server |
| LiteLLM | `litellm` | Universal LLM proxy |
| SAP AI Core | `sap_ai_core` | SAP enterprise AI |
| Poe | `poe` | Poe.com models |
| OpenAI Compatible | `openai_compat` | Any OpenAI-compatible API |

If you prefer to edit the config file directly, providers are defined in config.yaml:

```yaml
general:
  default_provider: my-openai
  default_model: gpt-4o

providers:
  my-openai:
    type: openai
    api_key: "sk-..."
    model: gpt-4o
  local-ollama:
    type: ollama
    base_url: "http://localhost:11434"
    model: llama3.1
  my-anthropic:
    type: anthropic
    api_key: "sk-ant-..."
```

Run astonish setup for an interactive walkthrough that configures your provider and stores credentials securely. You can re-run it at any time to add more providers or change settings.

You can configure multiple providers and switch between them:

  • Use the --provider flag on the CLI
  • Change the active provider in Studio settings
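The precedence between these two mechanisms can be sketched as follows. This is an illustrative helper, not Astonish's actual implementation; `resolve_provider` is a hypothetical function name, and the sketch assumes the CLI flag overrides the configured default:

```python
def resolve_provider(config, cli_provider=None):
    """Return the name of the provider to use.

    Hypothetical resolution logic: an explicit --provider value wins;
    otherwise fall back to general.default_provider from config.yaml.
    """
    providers = config.get("providers", {})
    if cli_provider is not None:
        # Reject names that are not configured rather than failing later.
        if cli_provider not in providers:
            raise KeyError(f"unknown provider: {cli_provider}")
        return cli_provider
    return config["general"]["default_provider"]


# Mirror of the example config.yaml above, as a plain dict.
config = {
    "general": {"default_provider": "my-openai", "default_model": "gpt-4o"},
    "providers": {
        "my-openai": {"type": "openai", "model": "gpt-4o"},
        "local-ollama": {"type": "ollama", "model": "llama3.1"},
    },
}

print(resolve_provider(config))                  # my-openai
print(resolve_provider(config, "local-ollama"))  # local-ollama
```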

API keys are automatically scrubbed from the config file into the encrypted credential store after initial setup. You do not need to keep plaintext keys in the config file.

The openai_compat type works with any OpenAI-compatible API endpoint, including vLLM, text-generation-inference, and other local or remote servers.
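As an illustration, a local vLLM server could be wired up through this type roughly like so. The provider name, port, and model identifier below are placeholders, not values from Astonish's documentation; vLLM's OpenAI-compatible server listens on port 8000 under the /v1 path by default:

```yaml
providers:
  local-vllm:
    type: openai_compat
    base_url: "http://localhost:8000/v1"   # vLLM's default OpenAI-compatible endpoint
    api_key: "not-needed"                  # many local servers accept any placeholder key
    model: "meta-llama/Llama-3.1-8B-Instruct"
```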