Providers

PersistenceAI supports multiple LLM providers. You can configure API keys for any provider you want to use.

Connecting a Provider

To connect a provider, run:

/connect

Then select your provider from the list and enter your credentials (an API key for hosted providers, or a base URL for self-hosted ones).

Supported Providers

PersistenceAI supports all major LLM providers, including OpenAI, Anthropic, Google, and Ollama.

Provider Configuration

Provider configurations are stored in your global config file.

Example Configuration

{
  "providers": {
    "openai": {
      "apiKey": "sk-..."
    },
    "anthropic": {
      "apiKey": "sk-ant-..."
    },
    "ollama": {
      "baseURL": "http://localhost:11434"
    }
  }
}
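
The shape of this file is simple: hosted providers take an apiKey, while self-hosted providers such as Ollama take a baseURL instead. As a minimal sketch of how such a structure could be consumed (the type names and config path below are illustrative assumptions, not part of PersistenceAI's documented API):

// Minimal sketch: reading the providers block from a global config file.
// The file path and type names here are assumptions for illustration only.
import { readFileSync } from "node:fs";

interface ProviderConfig {
  apiKey?: string;  // hosted providers (OpenAI, Anthropic, ...)
  baseURL?: string; // self-hosted providers (Ollama)
}

interface GlobalConfig {
  providers: Record<string, ProviderConfig>;
}

// Substitute the actual path of your global config file here.
const config: GlobalConfig = JSON.parse(
  readFileSync("config.json", "utf8"),
);

for (const [name, provider] of Object.entries(config.providers)) {
  console.log(
    provider.baseURL
      ? `${name}: self-hosted at ${provider.baseURL}`
      : `${name}: hosted, authenticated via API key`,
  );
}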

Ollama (Self-Hosted)

PersistenceAI has excellent support for Ollama, allowing you to run models locally without API costs.

Setup

Pull the models you want to use:

ollama pull llama2
ollama pull codellama

Then connect PersistenceAI to your local server:

/connect

Select "Ollama" and enter your base URL (default: http://localhost:11434).

Recommended Models

The models pulled above are a good starting point: llama2 for general-purpose chat and codellama for coding tasks.

PersistenceAI Zero

PersistenceAI Zero is a curated list of models that have been tested and verified by the PersistenceAI team.

Model Selection

You can switch between models at any time from within PersistenceAI.

Provider-Specific Notes

OpenAI

Anthropic

Google

Ollama

Ollama is self-hosted, so no API key is required; PersistenceAI only needs the base URL of your local server.


For more information, see the rest of the PersistenceAI documentation.