Self-hosted AI



By default, AI features in Forest Admin are processed by Forest Admin servers. To keep your data private, you can route AI requests through your own agent using the `addAi` method, so your data never leaves your infrastructure.

Configuration

```js
const agent = createAgent(options)
  .addDataSource(/* ... */)
  .addAi({
    name: 'my-assistant',
    provider: 'openai',
    apiKey: process.env.OPENAI_API_KEY,
    model: 'gpt-4o',
  });
```

Custom LLM endpoint

You can point to any OpenAI-compatible API (self-hosted LLM, Azure OpenAI, etc.) using the `configuration` option:

```js
.addAi({
  name: 'my-assistant',
  provider: 'openai',
  apiKey: process.env.LLM_API_KEY,
  model: 'my-model',
  configuration: {
    baseURL: 'https://my-llm.example.com/v1',
  },
});
```

Options

| Option | Type | Required | Description |
| --- | --- | --- | --- |
| `name` | string | Yes | Unique identifier for this AI configuration |
| `provider` | string | Yes | AI provider (`'openai'`) |
| `model` | string | Yes | Model to use (e.g., `'gpt-4o'`, `'o3-mini'`) |
| `apiKey` | string | No | API key (defaults to the `OPENAI_API_KEY` environment variable) |
| `configuration` | ClientOptions | No | OpenAI client options (`baseURL`, `timeout`, `defaultHeaders`) |

All OpenAI client options are supported.
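For example, a request timeout and extra headers can be passed through `configuration` alongside `baseURL`. This is an illustrative sketch: the `X-Proxy-Token` header and `PROXY_TOKEN` environment variable are hypothetical names for an internal gateway, not part of the Forest Admin or OpenAI APIs.

```js
.addAi({
  name: 'my-assistant',
  provider: 'openai',
  apiKey: process.env.LLM_API_KEY,
  model: 'gpt-4o',
  configuration: {
    baseURL: 'https://my-llm.example.com/v1',
    timeout: 30_000, // abort requests that take longer than 30 seconds
    defaultHeaders: {
      // Hypothetical header forwarded to an internal proxy in front of the LLM
      'X-Proxy-Token': process.env.PROXY_TOKEN,
    },
  },
});
```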

Supported models

Any OpenAI-compatible model with function calling support works. Legacy models without function calling (gpt-4, gpt-3.5-turbo, completion models) are not supported.
