# Connecting a self-hosted instance to an LLM

Self-hosted customers can choose whether to enable AI features on their instance.

To enable AI features, they must provide an endpoint and an API key for a supported LLM provider.

You configure the connection from the **AI Settings** tab on **Structural Settings**.

## Supported providers

The supported providers are:

* **Anthropic.** **We strongly recommend Anthropic**, because Structural's AI features are optimized for Claude models.<br>

  If you use a hosted Anthropic deployment, Structural requires the following models:

  * claude-opus-4-6
  * claude-sonnet-4-5
  * claude-haiku-4-5
* **OpenAI.** Note that we are continuing to test OpenAI for the Structural Agent. Structural supports any OpenAI deployment that:

  * Supports API key authentication.
  * Supports the OpenAI Responses API.
  * Supports OpenAI proprietary models, such as GPT-4/5.x.

  If you use a hosted OpenAI deployment such as Microsoft Foundry, Structural requires the following model deployments. The deployment name must match the model name.

  * gpt-4.1-mini
  * gpt-4.1
  * gpt-5.2
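
Before configuring Structural, you may want to confirm that the required model deployments are actually available on your endpoint. The sketch below is illustrative and not part of Structural: it assumes an OpenAI-compatible `GET /v1/models` endpoint, and the helper names are hypothetical.

```python
import json
import urllib.request

# Required model deployments for a hosted OpenAI deployment (from the list above).
REQUIRED_OPENAI_MODELS = {"gpt-4.1-mini", "gpt-4.1", "gpt-5.2"}

def missing_models(available: set[str], required: set[str]) -> set[str]:
    """Return the required model names that are not in the available set."""
    return required - available

def list_models(base_url: str, api_key: str) -> set[str]:
    """Fetch model IDs from an OpenAI-compatible /v1/models endpoint."""
    req = urllib.request.Request(
        f"{base_url.rstrip('/')}/v1/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return {model["id"] for model in data.get("data", [])}

# Example of the local check (no network required):
# missing_models({"gpt-4.1", "gpt-4o"}, REQUIRED_OPENAI_MODELS)
```

If `missing_models` returns a non-empty set, create those deployments before enabling AI features; remember that the deployment name must match the model name.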

## Selecting and configuring a provider

To select your LLM provider:

<figure><img src="https://3378426797-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-LSQCLFQ4bslJ-HYc8c3%2Fuploads%2Fyzqi3OThNf0YyOm7j8lk%2FAISettings.png?alt=media&#x26;token=c62997ba-66b7-4fa2-982b-d10243f42607" alt=""><figcaption><p>AI Settings page with a configured LLM provider</p></figcaption></figure>

1. From the **LLM Provider** dropdown list, select **API Key & Endpoint**.
2. Click the provider to use. We strongly recommend Anthropic, because Structural's AI features are optimized for Claude models.
3. In the **\<Provider> Endpoint** field, enter the endpoint URL for the LLM provider.\
   \
   For Anthropic, this is the Anthropic endpoint URL for generative AI services. It should be the base URL for the Anthropic API, without the version. The default is `https://api.anthropic.com`.\
   \
   For OpenAI, this is the OpenAI endpoint URL to use for AI-enhanced features. It should be the base URL for the OpenAI API, without the version. The default is `https://api.openai.com`. For example, a Microsoft Foundry URL might be: `https://my-foundry-instance.ai.azure.com/openai`
4. In the **\<Provider> API Key** field, enter the API key for the LLM provider.\
   \
   In production environments, make sure that the API key is kept secure and encrypted.
5. To test the connection, click **Test \<Provider> Connection**.
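
If the connection test fails, it can help to reproduce the request outside of Structural to rule out endpoint or key problems. Below is a minimal, illustrative sketch of a direct Anthropic Messages API call; it assumes the publicly documented `x-api-key` and `anthropic-version` headers, and the helper name and placeholder key are hypothetical.

```python
import json
import urllib.request

def build_anthropic_request(base_url: str, api_key: str) -> urllib.request.Request:
    """Build a minimal Anthropic Messages API request.

    Note that the configured base URL omits the version segment; the API
    version is sent in the `anthropic-version` header instead.
    """
    body = {
        "model": "claude-haiku-4-5",
        "max_tokens": 16,
        "messages": [{"role": "user", "content": "ping"}],
    }
    return urllib.request.Request(
        f"{base_url.rstrip('/')}/v1/messages",
        data=json.dumps(body).encode(),
        headers={
            "x-api-key": api_key,
            "anthropic-version": "2023-06-01",
            "content-type": "application/json",
        },
        method="POST",
    )

req = build_anthropic_request("https://api.anthropic.com", "sk-ant-...")
# urllib.request.urlopen(req)  # uncomment to actually send the request
```

A `401` response indicates a bad API key, while a connection error usually points to a wrong or unreachable endpoint URL.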
