# Connecting a self-hosted instance to an LLM

On a self-hosted Structural instance, before you can use Structural AI features, you must provide an API key for a supported LLM provider. The supported providers are:

* Anthropic. **We strongly recommend Anthropic**, because Structural's AI features are optimized for Claude models.
* OpenAI. Note that we are continuing to test OpenAI for the Structural Agent. Structural supports any OpenAI deployment that:
  * Supports API key authentication.
  * Supports the OpenAI Responses API.
  * Supports OpenAI proprietary models, such as GPT-4.x and GPT-5.x.

To configure the connection to the LLM provider, use the following [environment settings](https://docs.tonic.ai/app/admin/environment-variables-setting). You can add these settings manually to the **Environment Settings** list on **Structural Settings**.

## Identifying the LLM provider <a href="#self-hosted-llm-provider" id="self-hosted-llm-provider"></a>

First, to identify a supported LLM provider to use for Structural AI features:

* `TONIC_LLM_PROVIDER` - By default, this is not set, and the AI features are not available.\
  \
  The available values are:
  * `Anthropic`
  * `OpenAI`
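How you set this depends on your deployment: you can add it on the **Environment Settings** list, or, if your instance reads settings from the process environment, as an environment variable. As a shell sketch (the export mechanism is an assumption about your deployment):

```shell
# Select Anthropic as the LLM provider for Structural AI features.
# This setting is unset by default, which leaves the AI features disabled.
export TONIC_LLM_PROVIDER=Anthropic
```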

## Providing connection details for Anthropic <a href="#llm-anthropic-connection-details" id="llm-anthropic-connection-details"></a>

When `TONIC_LLM_PROVIDER` is `Anthropic`, to provide the required connection information:

* `TONIC_ANTHROPIC_ENDPOINT` - The Anthropic endpoint URL for generative AI services. Should include the endpoint base URL for the Anthropic API, without the version. \
  \
  **Default:** `https://api.anthropic.com`
* `TONIC_ANTHROPIC_API_KEY` - The Anthropic API key for authentication. In production environments, make sure that this is kept secure and encrypted.
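As a sketch, setting these as environment variables might look like the following. The API key value is a placeholder, and the export mechanism is an assumption about your deployment:

```shell
# Anthropic connection settings for Structural AI features.
# The endpoint is the API base URL without a version segment;
# the value shown here is the default used when the setting is omitted.
export TONIC_ANTHROPIC_ENDPOINT=https://api.anthropic.com
# Placeholder key for illustration only. In production, store the real
# key securely (for example, in a secrets manager) rather than in plain text.
export TONIC_ANTHROPIC_API_KEY=sk-ant-xxxxxxxx
```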

If you use a hosted Anthropic deployment, Structural requires the following models:

* claude-opus-4-6
* claude-sonnet-4-5
* claude-haiku-4-5

## Providing connection details for OpenAI <a href="#llm-openai-connection-details" id="llm-openai-connection-details"></a>

When `TONIC_LLM_PROVIDER` is `OpenAI`, to provide the required connection information:

* `TONIC_OPENAI_ENDPOINT` - The OpenAI endpoint URL to use for AI-enhanced features. Should include the endpoint base URL for the OpenAI API, without the version.\
  \
  **Default:** `https://api.openai.com`\
  \
  **Example URL for Microsoft Foundry:** `https://my-foundry-instance.ai.azure.com/openai`
* `TONIC_OPENAI_API_KEY` - The API key for OpenAI.
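As with the Anthropic settings, these can be sketched as environment variables. The API key and the Foundry instance name are placeholders, and the export mechanism is an assumption about your deployment:

```shell
# OpenAI connection settings for Structural AI features.
# The endpoint is the API base URL without a version segment;
# the value shown here is the default used when the setting is omitted.
export TONIC_OPENAI_ENDPOINT=https://api.openai.com
# For a Microsoft Foundry deployment, the endpoint instead looks like:
# export TONIC_OPENAI_ENDPOINT=https://my-foundry-instance.ai.azure.com/openai
# Placeholder key for illustration only -- store the real key securely.
export TONIC_OPENAI_API_KEY=sk-xxxxxxxx
```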

Note that if you use a hosted OpenAI deployment such as Microsoft Foundry, Structural requires the following model deployments. The deployment name must match the model name.

* gpt-4.1-mini
* gpt-4.1
* gpt-5.2
