Configure generative AI in Baserow

Baserow integrates with multiple AI providers (OpenAI, Anthropic, Ollama, OpenRouter, Mistral) to power the AI field and generative features.

What you need to know

Baserow’s generative AI configuration lets you connect your preferred AI models directly to your workspace.

Self-hosted Baserow users can configure providers at the workspace level for team-specific models, or at the instance level for organization-wide settings.

Baserow AI field configuration interface

Configure API keys

At instance level

For self-hosted Baserow, configure API keys globally using environment variables. This applies settings across all workspaces in your instance.
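
As an illustration, instance-wide settings are supplied through your deployment's environment, for example in a `.env` file read by docker-compose. The variable names below are assumed from Baserow's `BASEROW_<PROVIDER>_*` naming pattern; confirm the exact names against the environment variable reference for your Baserow version.

```shell
# .env file consumed by your Baserow deployment (docker-compose, k8s, etc.).
# Variable names assumed from the BASEROW_<PROVIDER>_* pattern --
# verify them against the environment variable reference for your version.
BASEROW_OPENAI_API_KEY=sk-your-key-here
BASEROW_OPENAI_MODELS=gpt-3.5-turbo,gpt-4   # comma-separated, no spaces
```

Restart the Baserow service after changing environment variables so the new settings take effect.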

At workspace level

  1. Navigate to the workspace dashboard to access the settings
  2. Select Generative AI
  3. Enter your API key and model preferences

The settings you save apply to all workspace members.

Workspace-level configuration overrides instance defaults. Leave fields blank to use instance-level settings.
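
This precedence rule can be illustrated with a small shell sketch (illustrative only; Baserow applies the equivalent logic internally):

```shell
# Workspace setting wins when present; a blank value falls back to the
# instance-level default, mirroring the precedence described above.
INSTANCE_MODELS="gpt-3.5-turbo,gpt-4"   # set via environment variables
WORKSPACE_MODELS=""                     # left blank in workspace settings

# ${var:-default} substitutes the default when var is unset or empty.
EFFECTIVE_MODELS="${WORKSPACE_MODELS:-$INSTANCE_MODELS}"
echo "$EFFECTIVE_MODELS"   # prints the instance default: gpt-3.5-turbo,gpt-4
```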


Supported AI providers

Provider   | Required configuration                              | Example models
OpenAI     | API key + organization (optional) + enabled models  | gpt-3.5-turbo, gpt-4
Anthropic  | API key + enabled models                            | claude-3-haiku-20240307, claude-opus-4-20250514
Ollama     | Host + enabled models                               | llama2, mistral
Mistral    | API key + enabled models                            | mistral-large-latest, mistral-small-latest
OpenRouter | API key + organization (optional) + enabled models  | openai/gpt-4o, anthropic/claude-3-haiku

OpenAI setup

Provide:

  • OpenAI API key for authentication. Without an API key, users cannot select OpenAI models.
  • OpenAI organization name (optional), used when making the API connection.
  • Comma-separated list of OpenAI models to enable (e.g., gpt-3.5-turbo, gpt-4-turbo-preview). This only takes effect if an OpenAI API key is set; if no models are listed, users cannot choose a model.
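
For self-hosted instances, these OpenAI settings can be sketched as environment variables (names assumed from Baserow's `BASEROW_OPENAI_*` pattern; check your version's reference):

```shell
# OpenAI settings for a self-hosted instance (variable names assumed
# from the BASEROW_OPENAI_* pattern -- verify against your version).
BASEROW_OPENAI_API_KEY=sk-your-key-here
BASEROW_OPENAI_ORGANIZATION=my-org                        # optional
BASEROW_OPENAI_MODELS=gpt-3.5-turbo,gpt-4-turbo-preview   # comma-separated
```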

Anthropic setup

Provide:

  • Anthropic API key for authentication. Without an API key, users cannot select Anthropic models.
  • Comma-separated list of Anthropic models to enable (e.g., claude-3-haiku-20240307, claude-opus-4-20250514). If no models are listed, users cannot choose a model.
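
For self-hosted instances, the Anthropic settings can likewise be supplied as environment variables (names assumed from Baserow's `BASEROW_ANTHROPIC_*` pattern; check your version's reference):

```shell
# Anthropic settings for a self-hosted instance (variable names assumed
# from the BASEROW_ANTHROPIC_* pattern -- verify against your version).
BASEROW_ANTHROPIC_API_KEY=sk-ant-your-key-here
BASEROW_ANTHROPIC_MODELS=claude-3-haiku-20240307,claude-opus-4-20250514
```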

OpenRouter setup

Provide:

  • OpenRouter API key for authentication.
  • OpenRouter organization name (optional), used when making the API connection.
  • Comma-separated list of OpenRouter models to enable (e.g., openai/gpt-4o, anthropic/claude-3-haiku). This only takes effect if an OpenRouter API key is set; if no models are listed, users cannot choose a model.
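
For self-hosted instances, the OpenRouter settings can be sketched as environment variables (names assumed from Baserow's `BASEROW_OPENROUTER_*` pattern; check your version's reference). Note that OpenRouter model names are prefixed with the upstream provider:

```shell
# OpenRouter settings for a self-hosted instance (variable names assumed
# from the BASEROW_OPENROUTER_* pattern -- verify against your version).
BASEROW_OPENROUTER_API_KEY=sk-or-your-key-here
BASEROW_OPENROUTER_ORGANIZATION=my-org                         # optional
BASEROW_OPENROUTER_MODELS=openai/gpt-4o,anthropic/claude-3-haiku
```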

Ollama setup

Provide:

  • Ollama host URL. Provide the URL of your Ollama server; this typically runs locally on your own device.
  • Comma-separated list of Ollama models to enable (e.g., llama2). If no models are listed, users cannot choose a model.

Ollama enables local model hosting for enhanced data privacy.
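
For self-hosted instances, the Ollama settings can be sketched as environment variables (names assumed from Baserow's `BASEROW_OLLAMA_*` pattern; check your version's reference). Ollama listens on port 11434 by default:

```shell
# Ollama settings for a self-hosted instance (variable names assumed
# from the BASEROW_OLLAMA_* pattern -- verify against your version).
BASEROW_OLLAMA_HOST=http://localhost:11434   # Ollama's default port
BASEROW_OLLAMA_MODELS=llama2,mistral
```

You can confirm the host is reachable and see which models are already pulled with `curl http://localhost:11434/api/tags`, Ollama's model-listing endpoint.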

Mistral setup

Provide:

  • Mistral API key for authentication. Without an API key, users cannot select Mistral models.
  • Comma-separated list of Mistral models to enable (e.g., mistral-large-latest, mistral-small-latest). If no models are listed, users cannot choose a model.
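
For self-hosted instances, the Mistral settings can be sketched as environment variables (names assumed from Baserow's `BASEROW_MISTRAL_*` pattern; check your version's reference):

```shell
# Mistral settings for a self-hosted instance (variable names assumed
# from the BASEROW_MISTRAL_* pattern -- verify against your version).
BASEROW_MISTRAL_API_KEY=your-key-here
BASEROW_MISTRAL_MODELS=mistral-large-latest,mistral-small-latest
```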

Frequently asked questions

What AI models are available on Baserow Cloud?

Models from the supported providers (OpenAI, Anthropic, Ollama, OpenRouter, and Mistral) power the AI field and generative features; availability depends on which providers and models are enabled.

Can I use multiple AI providers simultaneously?

Yes. Configure multiple providers at workspace or instance level. Users can select their preferred model when creating AI fields.

Where do I find my API keys?

API keys are issued by each provider, typically from the account or developer dashboard on the provider's website (for example, OpenAI's platform dashboard or Anthropic's console). Ollama runs locally and does not require an API key.

What happens if I don’t configure an API key for self-hosted Baserow?

AI fields will be disabled until you add at least one provider’s API key at workspace or instance level.

Do workspace-level settings override instance settings?

Yes. Workspace-level API keys and model selections take precedence over instance defaults. Leave workspace fields blank to inherit instance configuration.


Need help? Visit the Baserow community or contact support for assistance with your account.