Snowflake on AWS


Tonic supports moving data from one database to another within a single Snowflake instance and can also move data between Snowflake instances. In both situations, Tonic uses S3 as an intermediate stage to host both real and masked data.
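The S3 staging step above relies on Snowflake's COPY command, used in both directions: unload real data from a table to S3, then load masked files from S3 back into the destination table. The sketch below builds both statements in Python; it is an illustration of the pattern, not Tonic's actual implementation, and the bucket paths and table name are hypothetical placeholders.

```python
# Illustrative sketch of Snowflake's unload/load COPY pattern through S3.
# Not Tonic's actual code; bucket paths and table names are hypothetical.

def unload_sql(table: str, s3_path: str) -> str:
    """Build a COPY INTO <location> statement that unloads a table to S3."""
    return (
        f"COPY INTO '{s3_path}/{table}/' "
        f"FROM {table} "
        "FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP) "
        "MAX_FILE_SIZE = 16777216"  # 16 MB files, matching the default chunk size
    )

def load_sql(table: str, s3_path: str) -> str:
    """Build a COPY INTO <table> statement that loads masked files from S3."""
    return (
        f"COPY INTO {table} "
        f"FROM '{s3_path}/{table}/' "
        "FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP)"
    )

# Hypothetical source and destination paths, as configured in the Tonic UI:
print(unload_sql("CUSTOMERS", "s3://example-tonic-stage/source"))
print(load_sql("CUSTOMERS", "s3://example-tonic-stage/masked"))
```

In practice the S3 location would be referenced through a Snowflake external stage or storage integration that carries the AWS credentials; the statements here show only the shape of the round trip.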

Diagram of the Tonic ETL process on Snowflake

Below is a high-level diagram describing how Tonic orchestrates the processing and movement of data in Snowflake. This diagram should not be confused with Tonic's architectural diagram.
Diagram of data being moved and processed in AWS
Tonic orchestrates the moving and transforming of data between Snowflake databases hosted in AWS. The AWS services S3, SQS, and Lambda are used to accomplish this task. Tonic manages the lifetimes of the data and resources used in AWS, and only requires that the necessary permissions be assigned to the IAM role used by Tonic. At a high level, the process is:
  1. Create an AWS Lambda function for your version of Tonic. This step is performed once per version of Tonic. Lambda function creation happens when you run your first job after a new installation or after a version upgrade.
  2. Create an AWS SQS queue and S3 event triggers. This is done once per job, and the resource names are scoped to your specific generation job.
  3. COPY table data into S3. The S3 bucket path is specified in the Tonic UI.
  4. As files land in S3, S3 event notifications place messages in SQS. Messages in SQS trigger Lambda function invocations. By default, each file placed in S3 has a maximum size of 16 MB, and each Lambda invocation processes a single file. Lambda processes each file and writes it back to S3 in a different location, also specified by the user in the UI.
  5. Once all files for a table are processed, Tonic copies the data back into Snowflake, into the destination database.
  6. Once all tables are processed, Tonic removes the ephemeral AWS components, such as the SQS queue and S3 event notifications.
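The per-job wiring in steps 2 and 4 amounts to an S3 bucket notification configuration that routes object-creation events to a job-scoped SQS queue. The sketch below builds that configuration as a plain dictionary in the payload shape accepted by boto3's `put_bucket_notification_configuration`; the queue ARN, prefix, and bucket name are hypothetical, and this is an illustration of the mechanism rather than Tonic's actual code.

```python
# Illustrative sketch of the ephemeral S3 -> SQS wiring Tonic creates per job.
# Names (queue ARN, key prefix, bucket) are hypothetical placeholders.

def s3_to_sqs_notification(queue_arn: str, job_prefix: str) -> dict:
    """Route ObjectCreated events under a job-scoped key prefix to an SQS queue."""
    return {
        "QueueConfigurations": [
            {
                "QueueArn": queue_arn,
                "Events": ["s3:ObjectCreated:*"],
                "Filter": {
                    "Key": {
                        "FilterRules": [
                            # Only files written for this generation job
                            {"Name": "prefix", "Value": job_prefix}
                        ]
                    }
                },
            }
        ]
    }

config = s3_to_sqs_notification(
    "arn:aws:sqs:us-east-1:123456789012:tonic-job-42",  # hypothetical ARN
    "tonic/job-42/raw/",
)
# A client would then apply it with:
#   s3.put_bucket_notification_configuration(
#       Bucket="example-tonic-stage", NotificationConfiguration=config)
# and the cleanup in step 6 would apply an empty NotificationConfiguration
# and delete the queue, removing the ephemeral resources.
```

Scoping the event filter to a job-specific prefix is what lets multiple generation jobs share one staging bucket without their notifications crossing.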


Snowflake support includes all standard Tonic features, with the exception of Subsetting, which is not currently supported.
If you'd like to use Subsetting in conjunction with Snowflake, reach out to [email protected]