LLM used on Structural Cloud
For its AI-enhanced features, Structural Cloud uses Anthropic's Opus 4.6, Sonnet 4.5, and Haiku 4.5 models, deployed on Microsoft Foundry.
How information is used by the LLM
Our use of information follows Anthropic's usage policy.
Your input prompts and generated outputs remain private and are never used to train the underlying models. Even so, as a best practice, we recommend that you do not include highly sensitive values in your manual prompts.
To deliver high-quality suggestions, Structural sends to the model as context:
Your database schema.
Representative data samples. Structural includes these by default, but you can configure it to exclude the sample data.
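The shape of the context described above can be sketched as follows. This is an illustrative example only, not Structural's actual request format: the helper name, parameters, and payload keys are all hypothetical, and the sample values are invented.

```python
# Hypothetical sketch of the context sent with a prompt: the schema is
# always included, while data samples are optional and can be excluded.
def build_model_context(schema, samples=None, include_samples=True):
    """Assemble a context payload (illustrative helper, not a real API)."""
    context = {"schema": schema}
    if include_samples and samples:
        context["samples"] = samples
    return context

# With sample data excluded, only the schema reaches the model.
ctx = build_model_context(
    schema={"users": ["id", "email", "created_at"]},
    samples={"users": [("1", "a@example.com", "2024-01-01")]},
    include_samples=False,
)
print(sorted(ctx.keys()))  # ['schema']
```

Excluding samples trades some suggestion quality for a smaller data footprint, which matches the default-versus-opt-out behavior described above.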
For answers to more detailed questions, reach out to our support team.