# LLMs used on Structural Cloud

For its AI-enhanced features, Structural Cloud uses the Anthropic Claude models Opus 4.6, Sonnet 4.5, and Haiku 4.5, deployed on [Microsoft Foundry](https://azure.microsoft.com/en-us/products/ai-foundry).

## How information is used by the LLM <a href="#llm-information-use" id="llm-information-use"></a>

Our use of information follows the [Anthropic usage policy](https://www.anthropic.com/legal/aup).

Your input prompts and generated outputs remain private, and they are never used to train the underlying models. As a best practice, however, we recommend that you do not include highly sensitive values in your manual prompts.

To deliver high-quality suggestions, Structural sends the following to the model as context:

* Your database schema.
* By default, representative data samples. However, you can [configure Structural to exclude the sample data](#llm-detection-exclude-sample-data).
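To illustrate the two items above, here is a minimal sketch of how such a context payload might be assembled. This is not Structural's actual implementation or API; the function name, field names, and payload shape are assumptions for illustration only.

```python
# Hypothetical sketch: field names and payload shape are illustrative,
# not Structural's real request format.

def build_llm_context(schema, sample_rows, include_samples=True):
    """Assemble model context: the schema is always sent; representative
    sample rows are included by default but can be excluded."""
    context = {"schema": schema}
    if include_samples:
        context["samples"] = sample_rows
    return context

# Example schema and representative sample rows (made-up data)
schema = {"customers": {"id": "integer", "email": "text"}}
samples = {"customers": [{"id": 1, "email": "a@example.com"}]}

# Default behavior: schema plus representative samples
default_context = build_llm_context(schema, samples)

# With sample data excluded, only the schema is sent
schema_only_context = build_llm_context(schema, samples, include_samples=False)
```

The key point the sketch captures is that excluding sample data narrows the context to schema metadata only, which is why suggestions may be less tailored when that option is enabled.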

If you have more detailed questions, contact our support team.
