# Changing cloud storage credentials and output location

{% hint style="info" %}
**Required dataset permission:** Edit dataset settings
{% endhint %}

For a cloud storage dataset, you can:

* Update the cloud storage credentials. This option is only available if you provided the credentials manually. If the credentials come from environment variables, then you cannot change them from Textual.
* Change the output location for the generated output files.

You configure the connection credentials and output location from the **Dataset settings** page. To display the **Dataset settings** page, on the dataset details page, click **Project settings**.

After you update the configuration, click **Save Dataset**.

## Changing cloud storage credentials <a href="#dataset-cloud-creds" id="dataset-cloud-creds"></a>

From the credentials section, to update the cloud storage credentials, click **Update \<cloud storage solution> Credentials**.

### **Amazon S3**

To provide updated credentials for Amazon S3:

1. In the **Access Key** field, provide an AWS access key that is associated with an IAM user or role. For an example of a role that has the required permissions for an Amazon S3 dataset, go to [#amazon-s3-example-iam-role](https://docs.tonic.ai/textual/textual-install-administer/configuring-textual/enable-and-configure-textual-features/pipelines-example-iam-roles#amazon-s3-example-iam-role "mention").
2. In the **Access Secret** field, provide the secret key that is associated with the access key.
3. From the **Region** dropdown list, select the AWS Region to send the authentication request to.
4. In the **Session Token** field, provide the session token to use for the authentication request.
5. To test the credentials, click **Test AWS Connection**.
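Before you paste the values into the form, it can help to sanity-check their shape. The helper below is not part of Textual; it is a minimal sketch that relies only on the documented AWS key formats (access key IDs are 20 characters prefixed with `AKIA` for long-term keys or `ASIA` for temporary keys, and temporary keys always require a session token):

```python
import re

# Hypothetical helper (not part of Textual): sanity-check AWS credential
# shapes before entering them in the dataset settings form.
def check_aws_credentials(access_key: str, secret_key: str,
                          session_token: str = "") -> list[str]:
    """Return a list of problems found; an empty list means the shapes look valid."""
    problems = []
    # Access key IDs are 20 characters: AKIA = long-term IAM key, ASIA = temporary (STS) key.
    if not re.fullmatch(r"(AKIA|ASIA)[A-Z0-9]{16}", access_key):
        problems.append("access key should be 20 characters starting with AKIA or ASIA")
    # Secret access keys are 40 base64-style characters.
    if not re.fullmatch(r"[A-Za-z0-9/+=]{40}", secret_key):
        problems.append("secret key should be 40 characters")
    # Temporary (ASIA) keys are only valid together with a session token.
    if access_key.startswith("ASIA") and not session_token:
        problems.append("temporary credentials require a session token")
    return problems
```

A check like this only catches truncated copy-pastes; **Test AWS Connection** remains the authoritative verification.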

### **Azure**

To provide updated credentials for Azure:

1. In the **Account Name** field, provide the name of your Azure account.
2. In the **Account Key** field, provide the access key for your Azure account.
3. To test the connection, click **Test Azure Connection**.
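A similar shape check applies here. This sketch is not part of Textual; it assumes only the documented Azure conventions that storage account names are 3-24 lowercase letters and digits and that account keys are base64-encoded:

```python
import base64
import binascii
import re

# Hypothetical helper (not part of Textual): sanity-check Azure storage
# account credentials before entering them in the form.
def check_azure_credentials(account_name: str, account_key: str) -> list[str]:
    """Return a list of problems found; an empty list means the shapes look valid."""
    problems = []
    # Storage account names are 3-24 lowercase letters and digits.
    if not re.fullmatch(r"[a-z0-9]{3,24}", account_name):
        problems.append("account name must be 3-24 lowercase letters or digits")
    # Account keys are base64-encoded; a paste error usually breaks decoding.
    try:
        base64.b64decode(account_key, validate=True)
    except (binascii.Error, ValueError):
        problems.append("account key is not valid base64")
    return problems
```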

### **SharePoint**

SharePoint credentials must have the following application permissions (not delegated permissions):

* `Files.Read.All` - To read the SharePoint files
* `Files.ReadWrite.All` - To write redacted files and metadata back to SharePoint
* `Sites.ReadWrite.All` - To view and modify the SharePoint sites

To provide updated credentials for SharePoint:

1. In the **Tenant ID** field, provide the SharePoint tenant identifier for the SharePoint site.
2. In the **Client ID** field, provide the client identifier for the SharePoint site.
3. In the **Client Secret** field, provide the secret to use to connect to the SharePoint site.
4. To test the connection, click **Test SharePoint Connection**.
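The tenant ID and client ID both come from the Microsoft Entra app registration and are GUIDs, so a quick parse catches truncated copy-pastes before you test the connection. This helper is a sketch, not part of Textual:

```python
import uuid

# Hypothetical helper (not part of Textual): verify that the tenant ID and
# client ID are well-formed GUIDs before testing the SharePoint connection.
def check_sharepoint_ids(tenant_id: str, client_id: str) -> list[str]:
    """Return a list of problems found; an empty list means both IDs parse as GUIDs."""
    problems = []
    for label, value in (("tenant ID", tenant_id), ("client ID", client_id)):
        try:
            uuid.UUID(value)
        except ValueError:
            problems.append(f"{label} is not a valid GUID")
    return problems
```

The client secret has no fixed format, so **Test SharePoint Connection** is the only reliable check for it.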

## Setting the output location <a href="#dataset-output-location" id="dataset-output-location"></a>

The output location is where Textual writes the redacted files.

When you create a cloud storage dataset, after you select the initial set of files and folders, Textual prompts you to select the output location.

For an existing dataset, you set the output location from the **Output Location** section of the **Dataset settings** page.

Click the edit icon, then select the cloud storage folder where Textual writes the output files for the dataset.

When you generate output for a cloud storage dataset, Textual creates a folder in the output location. The folder name is the identifier of the job that generated the files.

Within the job folder, Textual recreates the folder structure of the original files, then writes each output file to the corresponding folder.
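The layout described above can be sketched as a simple path computation. The names here are hypothetical, and the assumption that original file paths are kept relative to the selected folder is an illustration, not a guarantee of Textual's exact behavior:

```python
from pathlib import PurePosixPath

# Sketch of the output layout: <output location>/<job ID>/<original relative path>.
# All names below are hypothetical examples.
def output_path(output_location: str, job_id: str, original_file: str) -> str:
    """Compute where a redacted copy of original_file would land."""
    return str(PurePosixPath(output_location) / job_id / original_file)

# A job with ID "a1b2c3" writing to the "redacted-output" folder:
print(output_path("redacted-output", "a1b2c3", "contracts/2024/nda.pdf"))
# → redacted-output/a1b2c3/contracts/2024/nda.pdf
```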
