# Using your own LLM key for AI usage

{% hint style="info" %}
Requires an Enterprise license.
{% endhint %}

{% hint style="info" %}
These instructions are for Fabricate Cloud only. Self-hosted instances must [provide their own LLM key during setup](/fabricate/self-hosting-fabricate/configuring-fabricate/llm-provider.md).

On Fabricate Cloud, before you can use this option, you must contact Tonic.ai to enable it for your account.

After Tonic.ai enables the option, Fabricate does not process any LLM requests until you configure the LLM provider and key.
{% endhint %}

Enterprise accounts can choose to provide an API key to use for Fabricate AI features. The API key must be from one of the following LLM providers:

* Anthropic
* Amazon Bedrock
* Azure AI Foundry

When you provide your own LLM key, Tonic.ai does not charge you for AI usage, only for basic access. However, any [configured limit on AI usage cost](/fabricate/about-fabricate/fabricate-pricing-and-ai-usage/tracking-and-limiting-ai-usage-and-cost.md#limiting-your-ai-usage-overage) still applies.

## Overview of the flow with your own LLM key <a href="#flow-overview-customer-llm-key" id="flow-overview-customer-llm-key"></a>

The following diagram provides an overview of the flow of requests and data when you provide your own LLM key, in this case Amazon Bedrock.

<figure><img src="/files/KXcOldXJeT5yWdfXjk7h" alt=""><figcaption><p>Diagram of the flow of requests and data when you use your own LLM key</p></figcaption></figure>

## Enabling the option to provide your own key

To enable the option to use your own LLM key, you must contact Tonic.ai.

After Tonic.ai configures your account to require a custom LLM key, you must provide the key before you can use Fabricate.

## Selecting the LLM provider

On the **AI** page of **Account Settings**, you set up the LLM provider in **LLM Provider Settings**.

<figure><img src="/files/9kKNj1d6nOebJsLHRPgC" alt=""><figcaption><p>LLM Provider Settings for an Enterprise account</p></figcaption></figure>

From the **Provider** dropdown list, select the LLM provider to use.

The available providers are:

* Anthropic
* Amazon Bedrock
* Azure AI Foundry

## Providing the LLM provider API key

### Anthropic

If you choose Anthropic, then in the **Anthropic API Key** field, provide your Anthropic key.
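To confirm that the key is valid before you enter it, you can send a minimal request directly to the public Anthropic Messages API. A sketch (the model name, prompt, and placeholder key are illustrative; the endpoint and headers follow Anthropic's published API):

```python
import json
import urllib.request

# Placeholder; substitute your real Anthropic API key.
API_KEY = "sk-ant-..."

request = urllib.request.Request(
    "https://api.anthropic.com/v1/messages",
    data=json.dumps({
        "model": "claude-3-5-haiku-latest",  # illustrative model name
        "max_tokens": 8,
        "messages": [{"role": "user", "content": "ping"}],
    }).encode(),
    headers={
        "x-api-key": API_KEY,
        "anthropic-version": "2023-06-01",
        "content-type": "application/json",
    },
)

# Sending the request with urllib.request.urlopen(request) returns a
# 200 response if the key is valid; an invalid key raises HTTPError 401.
```

If this request succeeds with your key, the same key should work when you paste it into the **Anthropic API Key** field.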

### Amazon Bedrock

If you choose Amazon Bedrock, then:

1. In the **Amazon Bedrock API Key** field, provide your Amazon Bedrock key.
2. In the **AWS Region** field, specify the AWS Region where your account is located.
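The Region matters because Amazon Bedrock requests are served from Region-specific runtime endpoints. As an illustration of how the Region you enter maps to an endpoint (the endpoint pattern follows AWS's Bedrock Runtime documentation; the Region value is an example):

```python
# Example Region; use the AWS Region where your Bedrock models are hosted.
region = "us-east-1"

# Bedrock Runtime endpoints follow this Region-specific pattern.
endpoint = f"https://bedrock-runtime.{region}.amazonaws.com"
print(endpoint)  # https://bedrock-runtime.us-east-1.amazonaws.com
```

A mismatched Region causes requests to fail even with a valid key, so specify the Region where your Bedrock model access is enabled.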

### Azure AI Foundry

If you choose Azure AI Foundry, then:

1. In the **Azure AI Foundry API Key** field, provide the API key for your Azure AI Foundry endpoint.
2. In the **Azure AI Foundry Endpoint URL** field, provide the base URL for your Azure AI Foundry Anthropic-compatible endpoint.

## Testing and saving the LLM provider key

To test the provided key, click **Test Credentials**.

To save the configuration, click **Save Changes**.


---

# Agent Instructions: Querying This Documentation

If you need additional information that is not directly available in this page, you can query the documentation dynamically by asking a question.

Perform an HTTP GET request on the current page URL with the `ask` query parameter:

```
GET https://docs.tonic.ai/fabricate/about-fabricate/fabricate-pricing-and-ai-usage/using-your-own-llm-key-for-ai-usage.md?ask=<question>
```

The question should be specific, self-contained, and written in natural language.
The response will contain a direct answer to the question and relevant excerpts and sources from the documentation.

Use this mechanism when the answer is not explicitly present in the current page, you need clarification or additional context, or you want to retrieve related documentation sections.
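For example, a sketch in Python that builds a well-formed query URL (the question text is illustrative; `urlencode` handles percent-encoding of spaces and punctuation in the `ask` parameter):

```python
from urllib.parse import urlencode

# The current page URL.
base = (
    "https://docs.tonic.ai/fabricate/about-fabricate/"
    "fabricate-pricing-and-ai-usage/using-your-own-llm-key-for-ai-usage.md"
)

# An illustrative question; make yours specific and self-contained.
question = "Which LLM providers can I use with my own key?"

# Append the encoded `ask` query parameter to the page URL.
url = f"{base}?{urlencode({'ask': question})}"
# Perform an HTTP GET on `url` to receive the answer and excerpts.
```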
