
# netclaw provider

Manage the LLM providers that netclaw talks to. Run `netclaw provider` for an interactive TUI, or use subcommands to script provider setup.

If you haven’t run `netclaw init` yet, start there — it configures your first provider.

```sh
netclaw provider                          # launch TUI
netclaw provider <subcommand> [options]   # CLI mode
```

*Provider Manager TUI showing configured providers with health status*

On launch, the TUI probes every configured provider and shows health status:

- Healthy, models discovered
- Unreachable or auth failure
- Probe in progress

Select a provider to view details (type, auth, endpoint, model count) or take action:

| Key | Action |
| --- | --- |
| ↑ / ↓ | Navigate |
| Enter | Select / open details |
| K | Update API key (details view) |
| R | Remove provider (details view) |
| V | Re-validate connection (details view) |
| Esc | Back / quit |

The sentinel row `+ Add new provider...` starts an interactive add flow. Netclaw validates connectivity with a 20-second timeout and reports how many models it found.
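At its core, the connectivity check is a reachability probe with a deadline. A minimal Python sketch of the idea (hypothetical, not netclaw's actual code; the real probe also discovers models and verifies credentials):

```python
import socket

def probe(host: str, port: int, timeout: float = 20.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within
    the timeout. Illustrative only: a real provider probe would also
    issue an authenticated model-list request."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

For a local Ollama provider this would target `localhost:11434`; a failed probe corresponds to the "Unreachable" indicator in the TUI.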

OpenAI OAuth is only available through this TUI flow — select Add, choose OpenAI, then pick “ChatGPT Subscription” to authenticate with your existing account.

```sh
netclaw provider list
Name          Provider   Auth    Endpoint
my-anthropic  Anthropic  ApiKey  https://api.anthropic.com
my-ollama     Ollama     None    http://localhost:11434
```

This shows static config only — no live health probing. Open the TUI to see real-time provider health.

```sh
netclaw provider add <name> <type> [--api-key <key>] [--endpoint <url>]
```

| Flag | Description | Default |
| --- | --- | --- |
| `--api-key <key>` | API key for the provider | Prompted if required |
| `--endpoint <url>` | Custom endpoint URL | Provider default |

Provider type and endpoint are stored in `~/.netclaw/config/netclaw.json`. Credentials are encrypted in `secrets.json`. Restart the daemon after adding a provider so it picks up the new config.
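As an illustration, the non-secret half stored in `netclaw.json` might look like this (a hypothetical sketch; the field names and schema are assumptions, not documented behavior):

```json
{
  "providers": {
    "my-anthropic": {
      "type": "anthropic",
      "endpoint": "https://api.anthropic.com"
    },
    "my-ollama": {
      "type": "ollama",
      "endpoint": "http://localhost:11434"
    }
  }
}
```

The API key itself would live only in the encrypted `secrets.json`, never in this file.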

```sh
netclaw provider remove <name>
```

Netclaw blocks removal if any model role (Main, Fallback, or Compaction) references the provider:

```
Error: Cannot remove provider 'my-anthropic' — referenced by model role(s): Main, Fallback
Run `netclaw model set` to reassign these roles first, or `netclaw model clear` for optional roles.
```

Reassign models first with `netclaw model`, then remove.

Pick Anthropic or OpenAI for hosted models, Ollama for fully local inference, or OpenRouter for access to models from multiple vendors through a single key.

| Type | Display Name | Default Endpoint | Auth |
| --- | --- | --- | --- |
| `ollama` | Ollama | `http://localhost:11434` | None |
| `openai-compatible` | llama.cpp / vLLM | `http://localhost:11434` | None |
| `openai` | OpenAI | `https://api.openai.com` | OAuth or API key |
| `anthropic` | Anthropic | `https://api.anthropic.com` | API key |
| `openrouter` | OpenRouter | `https://openrouter.ai/api/v1` | API key |
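For scripting against these defaults, the table can be flattened into a lookup keyed by the `type` value (the endpoint strings come verbatim from the table above; the dictionary itself is illustrative, not part of netclaw's API):

```python
# Default endpoint for each provider type, per the table above.
DEFAULT_ENDPOINTS = {
    "ollama": "http://localhost:11434",
    "openai-compatible": "http://localhost:11434",
    "openai": "https://api.openai.com",
    "anthropic": "https://api.anthropic.com",
    "openrouter": "https://openrouter.ai/api/v1",
}

def default_endpoint(provider_type: str) -> str:
    """Look up the default used when --endpoint is omitted."""
    return DEFAULT_ENDPOINTS[provider_type]
```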
```sh
# Local Ollama on a remote GPU server
netclaw provider add my-ollama ollama --endpoint http://my-gpu-server:11434

# Anthropic with an API key
netclaw provider add my-anthropic anthropic --api-key sk-ant-...

# OpenAI with an API key
netclaw provider add my-openai openai --api-key sk-proj-...

# OpenRouter
netclaw provider add my-openrouter openrouter --api-key sk-or-...

# llama.cpp or vLLM behind an OpenAI-compatible endpoint
netclaw provider add my-llama openai-compatible --endpoint http://localhost:8080

# Remove a provider
netclaw provider remove my-ollama
```

After adding a provider, assign it to a model role with `netclaw model set`.

## Override API keys with environment variables


Skip config files entirely by setting an environment variable:

```sh
export NETCLAW_Providers__my-anthropic__ApiKey="sk-ant-..."
```

Double underscores (`__`) separate config path segments, and the `NETCLAW_` prefix is required. Note that most shells reject hyphens in variable names set via `export`; if your provider name contains one, launch the process with `env 'NETCLAW_Providers__my-anthropic__ApiKey=sk-ant-...' …` instead, since `env` accepts arbitrary variable names.
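The convention maps mechanically onto config paths. A small illustrative parser (not netclaw's actual code) shows the split:

```python
def env_to_config_path(name: str, prefix: str = "NETCLAW_") -> list[str]:
    """Split an override variable name into config path segments.
    Double underscores separate segments; the prefix is mandatory."""
    if not name.startswith(prefix):
        raise ValueError(f"expected {prefix} prefix, got: {name}")
    return name[len(prefix):].split("__")
```

So `NETCLAW_Providers__my-anthropic__ApiKey` addresses the path `Providers` → `my-anthropic` → `ApiKey`.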