Structural process overviews for Snowflake
The following high-level diagrams describe how Tonic Structural orchestrates the processing and moving of data in Snowflake.
Snowflake on AWS
Structural manages the lifetime of the data and resources that it uses in AWS. You only need to assign the necessary permissions to the IAM role that Structural uses.
By default, Structural uses the following data generation process:

At a high level:
Structural copies the table data into either an S3 bucket or an external stage as CSV files. You specify the S3 bucket path or stage in the Structural workspace configuration. If you use a single location for both source and destination data, Structural copies the data files into an input folder.
Structural applies the configured generators to the data in the files, then writes the resulting files to an S3 bucket or external stage. If you use a single location for both source and destination data, Structural copies the data files into an output folder.
After it processes all of the files, Structural copies the data from the S3 bucket or external stage into the Snowflake destination database.
Snowflake on Azure
Structural orchestrates the moving and transforming of data between Snowflake databases that are hosted on Azure. Structural uses Azure Blob Storage for interim storage of files that contain the source and destination data.

At a high level, the data generation process is:
Structural copies the table data from the Snowflake database to files in Azure Blob Storage. You specify the container path in the Structural workspace configuration. Structural places the files in an input folder within the container path.
Structural applies the configured generators to the data in the files, then writes the resulting files to an output folder in the container path.
As it finishes processing each file, Structural copies the data from the container path’s output folder into the Snowflake destination database.
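The per-file flow described above can be sketched as a simple pipeline. This is an illustration under stated assumptions, not Structural's code: a local folder stands in for the Blob Storage container path, and the load callback stands in for the copy into the Snowflake destination database.

```python
from pathlib import Path
from typing import Callable

def process_and_load(container_path: Path,
                     transform: Callable[[str], str],
                     load: Callable[[Path], None]) -> None:
    """Transform each file from input/ into output/, loading each result
    as soon as it is finished rather than waiting for the whole batch."""
    input_dir = container_path / "input"
    output_dir = container_path / "output"
    output_dir.mkdir(parents=True, exist_ok=True)
    for src in sorted(input_dir.glob("*.csv")):
        dst = output_dir / src.name
        dst.write_text(transform(src.read_text()))
        load(dst)  # copy this file's data before processing the next file
```

Loading each file as it finishes, rather than after the whole batch, lets the destination load overlap with the remaining transformation work.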