Connecting a self-hosted instance to an LLM

On a self-hosted instance, before you can use Tonic Structural AI features, you must provide an API key for a supported LLM provider.

To configure the connection to the LLM provider, use the following environment settings. You can add these settings manually to the Environment Settings list on Structural Settings.

Identifying the LLM provider

First, identify the LLM provider to use for Structural AI features:

  • TONIC_LLM_PROVIDER - The LLM provider to use. By default, this is not set, and the AI features are not available. The available values are:

    • OpenAI
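If you set the value as an environment variable instead of adding it manually to the Environment Settings list, a minimal sketch might look like the following. The variable name comes from this page; how the variable reaches the Structural process (for example, a shell profile or a container environment file) depends on your deployment.

```shell
# Enable Structural AI features by selecting a provider.
# OpenAI is currently the only supported value.
export TONIC_LLM_PROVIDER="OpenAI"
```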

Providing connection details for OpenAI

When TONIC_LLM_PROVIDER is OpenAI, to provide the required connection information:

  • TONIC_OPENAI_ENDPOINT - The OpenAI endpoint URL to use for AI-enhanced features. This is the base URL for the OpenAI API, without the API version. Default: https://api.openai.com. Example URL for a Microsoft Foundry instance: https://my-foundry-instance.ai.azure.com/openai

  • TONIC_OPENAI_API_KEY - The API key for OpenAI.

Note that if you use a hosted OpenAI deployment such as Microsoft Foundry, Structural requires the following model deployments. Each deployment name must match the model name.

  • gpt-4.1-mini

  • gpt-4.1
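As a sketch of the OpenAI connection settings described above, set as environment variables (the endpoint and key values here are placeholders; substitute your own API key, and your own instance URL if you use a hosted deployment):

```shell
# Connection details when TONIC_LLM_PROVIDER is OpenAI.
# Base URL only -- do not include the API version in the path.
export TONIC_OPENAI_ENDPOINT="https://api.openai.com"

# Placeholder value; replace with your actual OpenAI API key.
export TONIC_OPENAI_API_KEY="sk-example-key"

# For a hosted deployment such as Microsoft Foundry, point the
# endpoint at your instance instead, for example:
# export TONIC_OPENAI_ENDPOINT="https://my-foundry-instance.ai.azure.com/openai"
```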
