# Configuring the file connector storage type and output options

On the workspace details view for a file connector workspace, you:

* Identify the type of storage. After you add a file group to the workspace, you cannot change the storage type.
* Indicate where to write the transformed files.
* If needed, provide credentials to access the cloud storage.

## Identifying the type of storage

On the workspace creation view:

1. Under **Connection Type**, under **File/Blob Storage**, click **Files**.
2. Select the type of file storage where the source files are located.

   * To choose files from Amazon S3, click **Amazon S3**.
   * To choose files from MinIO, make sure that the `TONIC_AWS_S3_OVERRIDE_URL` [environment setting](https://docs.tonic.ai/app/admin/environment-variables-setting) points to your MinIO endpoint, then click **Amazon S3**.
   * To choose files from GCS, click **Google Cloud Storage**.
   * To upload files from a local file system, click **Local Filesystem**.
   * To choose files from a local file mount, click **File Mount**. The file mount option is not available on Structural Cloud.<br>

     For information on how to mount a volume, go to the following:

     * [Kubernetes documentation](https://kubernetes.io/docs/concepts/storage/volumes/)
     * [Docker documentation](https://docs.docker.com/engine/storage/volumes/#syntax)
     * [Amazon ECS documentation](https://docs.aws.amazon.com/AmazonECS/latest/developerguide/docker-volumes.html)

     \
     In the **Source File Mount Path** field, provide the file mount path where the source files are located. The file mount path must be accessible by the container that runs the Structural application.

   After you add a file group to the workspace, you cannot change the storage type.
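For reference, a minimal Kubernetes sketch of mounting a host directory into the container that runs Structural might look like the following. The pod name, image, and paths are placeholders; see the linked Kubernetes documentation for the volume types and options that fit your cluster.

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: structural-app                    # placeholder name
spec:
  containers:
    - name: structural
      image: example/structural:latest    # placeholder image
      volumeMounts:
        - name: source-files
          mountPath: /mnt/source-files    # use this as the Source File Mount Path
  volumes:
    - name: source-files
      hostPath:
        path: /data/source-files          # directory on the node that holds the source files
```

Whatever mount mechanism you use, the path that you enter in **Source File Mount Path** must match the path as the Structural container sees it.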

## Selecting the location for the transformed files

### Local files

When the source files come from a local file system, Tonic Structural writes the output files to the large file store in the Structural application database. You can then [download the most recently generated files](https://docs.tonic.ai/app/setting-up-your-database/file-connector/file-connector-download-generated-files).

### Cloud storage

For cloud storage workspaces, in the **Output location** field, provide the path to the folder where Structural writes the transformed files.

### File mount

For files that come from a local file mount, you can write the output files to one of the following:

* An S3 bucket
* Google Cloud Storage
* A file mount

#### S3 bucket or Google Cloud Storage

For S3 buckets and Google Cloud Storage, in the **Output location** field, provide the path to the folder where Structural writes the transformed files.

#### File mount

To write the output to a file mount:

1. By default, the files are written to the same file mount path where the source files are located. To use a different file mount path:
   1. Toggle **Set different mount for output** to the on position.
   2. In the **Destination File Mount Path** field, provide the file mount path. The file mount path must be accessible by the container that runs the Structural application.
2. In the **Output location** field, provide the location within the file mount where Structural writes the transformed files.

## Providing credentials to access AWS <a href="#file-connector-config-aws-credentials" id="file-connector-config-aws-credentials"></a>

For a file connector workspace that writes files to Amazon S3, under **AWS Credentials**, you configure how Structural obtains the credentials to connect to Amazon S3.

### Selecting the type of credentials to use <a href="#file-connector-aws-config-credentials-type" id="file-connector-aws-config-credentials-type"></a>

Under **AWS Credentials**, click the type of credentials to use. The options are:

* **Environment -** Only available on self-hosted instances.\
  \
  Indicates to use either:
  * The credentials for the IAM role on the host machine.
  * The credentials set in the following [environment settings](https://docs.tonic.ai/app/admin/environment-variables-setting):
    * `TONIC_AWS_ACCESS_KEY_ID` - An AWS access key that is associated with an IAM user or role.
    * `TONIC_AWS_SECRET_ACCESS_KEY` - The secret key that is associated with the access key.
    * `TONIC_AWS_REGION` - The AWS Region to send the authentication request to.
* **Assumed role -** Indicates to use the specified assumed role.
* **User credentials -** Indicates to use the provided user credentials.
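For the **Environment** option, the settings are read from the Structural container's environment. A sketch of setting them (all values here are placeholders, not working credentials):

```shell
# Placeholder values: substitute your own access key, secret key, and Region.
export TONIC_AWS_ACCESS_KEY_ID="AKIAEXAMPLEKEY"
export TONIC_AWS_SECRET_ACCESS_KEY="example-secret-key"
export TONIC_AWS_REGION="us-east-1"
```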

### Providing an assumed role <a href="#file-connector-config-aws-assumed-role" id="file-connector-config-aws-assumed-role"></a>

To provide an assumed role, click **Assume role**, then:

1. In the **Role ARN** field, provide the Amazon Resource Name (ARN) for the role.
2. In the **Session Name** field, provide the role session name.\
   \
   If you do not provide a session name, then Structural automatically generates a default unique value. The generated value begins with `TonicStructural`.
3. In the **Duration (in seconds)** field, provide the maximum length in seconds of the session. \
   \
   The default is `3600`, indicating that the session can be active for up to 1 hour.\
   \
   The provided value cannot exceed the maximum session duration that is configured for the role.
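The session-name and duration behavior described above can be sketched as follows. This is illustrative only, not Structural's actual implementation; the 900-second lower bound reflects the general AWS minimum for assumed-role sessions.

```python
import secrets

def default_session_name() -> str:
    """Illustrative only: build a unique session name with the
    documented TonicStructural prefix."""
    return f"TonicStructural-{secrets.token_hex(8)}"

def validate_duration(seconds: int, role_max_seconds: int = 3600) -> int:
    """Reject durations outside the range the role allows."""
    if not 900 <= seconds <= role_max_seconds:
        raise ValueError(
            f"duration must be between 900 and {role_max_seconds} seconds"
        )
    return seconds

name = default_session_name()
print(name)                      # e.g. TonicStructural-1f2e3d4c5b6a7988
print(validate_duration(3600))   # 3600
```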

For each assumed role, Structural generates the external ID that is used in the assume role request. Your role’s trust policy must be configured to condition on your unique external ID.

Here is an example trust policy:

```json
{
  "Version": "2012-10-17",
  "Statement": {
    "Effect": "Allow",
    "Principal": {
      "AWS": "<originating-account-id>"
    },
    "Action": "sts:AssumeRole",
    "Condition": {
      "StringEquals": {
        "sts:ExternalId": "<external-id>"
      }
    }
  }
}
```
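If you manage the trust policy in code, a stdlib-only sketch like the following can assemble the document shown above before you apply it with your infrastructure tooling or the AWS CLI. The account ID and external ID values are placeholders that you substitute with your own.

```python
import json

def build_trust_policy(account_id: str, external_id: str) -> str:
    """Assemble the trust policy shown above as a JSON document.
    Both arguments are placeholders to replace with real values."""
    policy = {
        "Version": "2012-10-17",
        "Statement": {
            "Effect": "Allow",
            "Principal": {"AWS": account_id},
            "Action": "sts:AssumeRole",
            "Condition": {
                "StringEquals": {"sts:ExternalId": external_id}
            },
        },
    }
    return json.dumps(policy, indent=2)

doc = build_trust_policy("123456789012", "example-external-id")
print(doc)
```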

### Providing the AWS credentials

To provide the credentials, under **AWS Credentials**:

1. In the **AWS Access Key** field, enter the AWS access key that is associated with an IAM user or role.
2. In the **AWS Secret Key** field, enter the secret key that is associated with the access key.
3. In the **AWS Session Token** field, you can optionally provide a session token for a temporary set of credentials.
4. From the **AWS Region** dropdown list, select the AWS Region to send the authentication request to.
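The fields above correspond to the credential parameters that AWS SDKs generally accept. A sketch, with placeholder names, of assembling them, where the session token is included only when one is provided:

```python
def aws_credential_kwargs(access_key, secret_key, region, session_token=None):
    """Build the keyword arguments an AWS SDK client typically takes.
    The session token is optional: it is only needed for temporary
    credentials, and is omitted from the result when not supplied."""
    kwargs = {
        "aws_access_key_id": access_key,
        "aws_secret_access_key": secret_key,
        "region_name": region,
    }
    if session_token:
        kwargs["aws_session_token"] = session_token
    return kwargs

# Placeholder values, not working credentials.
print(aws_credential_kwargs("AKIAEXAMPLEKEY", "example-secret", "us-east-1"))
```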

### Providing different credentials for the output location <a href="#file-connector-s3-output-credentials" id="file-connector-s3-output-credentials"></a>

By default, Structural uses the same AWS credentials to both retrieve the source files and write the output files.

To provide different AWS credentials for the output location:

1. Toggle **Set different credentials for output** to the on position.
2. Select the method to use to provide the output credentials.\
   \
   For the output credentials, you can use either an assumed role or provide the credentials manually.
3. Provide the assumed role or credentials.

### Testing AWS credentials <a href="#file-connector-s3-test-credentials" id="file-connector-s3-test-credentials"></a>

To verify that Structural is able to connect to Amazon S3, click the **Test S3 Connection** button.

If you use different credentials for the source and output, then Structural provides a separate test option for each set of credentials.

* To test the source credentials, click **Test S3 Source Connection**.
* To test the output credentials, click **Test S3 Output Connection**.

## Providing credentials to access Google Cloud Storage <a href="#file-connector-config-gcs-credentials" id="file-connector-config-gcs-credentials"></a>

To write files to a folder in Google Cloud Storage, you must provide Google Cloud Platform credentials in the workspace configuration.

Under **GCP Credentials**:

1. For **Service Account File**, select the service account file (JSON file) for the source files.
2. In the **GCP Project ID** field, provide the identifier of the project that contains the source files.
3. To test the credentials, click **Test GCS Connection**.
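A GCP service account file is a JSON document, and the project ID can be read from its `project_id` field. A small stdlib-only sketch (the file contents here are a minimal stand-in, not real credentials):

```python
import json
import os
import tempfile

# Minimal stand-in for a service account file; real files carry more
# fields (private_key, client_email, and so on) and must be kept secret.
fake_account = {"type": "service_account", "project_id": "example-project"}

with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump(fake_account, f)
    path = f.name

with open(path) as f:
    project_id = json.load(f)["project_id"]

os.unlink(path)
print(project_id)  # example-project
```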

## Providing credentials to access MinIO <a href="#file-connector-config-minio-credentials" id="file-connector-config-minio-credentials"></a>

When the [environment setting](https://docs.tonic.ai/app/admin/environment-variables-setting) `TONIC_AWS_S3_OVERRIDE_URL` points to a MinIO endpoint, then when you select **Amazon S3** as the source, you create a MinIO workspace.

Under **AWS credentials**, you provide the MinIO credentials. The MinIO credentials consist of an access key and a secret key.

### Selecting the option for providing the MinIO credentials <a href="#file-connector-minio-credentials-options" id="file-connector-minio-credentials-options"></a>

The options for providing the credentials are:

* **Environment** - Only available on self-hosted instances. Use the credentials set in the following [environment settings](https://docs.tonic.ai/app/admin/environment-variables-setting):
  * `TONIC_AWS_ACCESS_KEY_ID` - A MinIO access key
  * `TONIC_AWS_SECRET_ACCESS_KEY` - The secret key that is associated with the access key
* **User credentials** - Provide the access key and secret key manually
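On a self-hosted instance, the **Environment** option reads settings like the following from the Structural container's environment. The endpoint URL and keys here are placeholders:

```shell
# Placeholder values: point the override at your MinIO deployment and
# supply the MinIO access key pair.
export TONIC_AWS_S3_OVERRIDE_URL="http://minio.example.internal:9000"
export TONIC_AWS_ACCESS_KEY_ID="minio-access-key"
export TONIC_AWS_SECRET_ACCESS_KEY="minio-secret-key"
```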

### Providing the MinIO credentials <a href="#file-connector-minio-credential-fields" id="file-connector-minio-credential-fields"></a>

To use the credentials from the environment settings, under **AWS Credentials**, click **Environment**.

To provide the credentials manually, under **AWS Credentials**:

1. In the **AWS Access Key** field, enter the MinIO access key.
2. In the **AWS Secret Key** field, enter the secret key that is associated with the access key.
3. By default, Structural uses the same credentials to both retrieve the source files and write the output files.\
   \
   To provide different MinIO credentials for the output location:
   1. Toggle **Set different credentials for output** to the on position.
   2. In the **AWS Access Key** field, enter the MinIO access key.
   3. In the **AWS Secret Key** field, enter the secret key that is associated with the access key.

