System requirements and limitations – Limitations on Structural support for indexes and global tables.
Structural differences and limitations – Features that are unavailable or work differently for the DynamoDB data connector.
Required DynamoDB configuration – Required configuration in DynamoDB before you create a DynamoDB workspace.
Configure workspace data connections – Data connection settings for DynamoDB workspaces.
Amazon DynamoDB is a key-value and document database.
For workspaces that use DynamoDB, the Tonic Structural application has the following differences.
On the following Structural views, the term "collection" replaces the term "table":
Privacy Hub
Schema Changes
References to columns are also replaced:
On Privacy Hub, the protection status panels refer to "fields" instead of "columns".
On the Schema Changes view, the change lists refer to "paths" instead of "columns".
For DynamoDB workspaces, Structural must scan each collection to determine the fields and data types within that collection. Until a scan is performed, you cannot configure the collection modes and generators.
For DynamoDB workspaces, Privacy Hub includes an additional Latest Collection Scan section that shows the most recent time that a scan was performed on each scanned collection.
For more information, go to Performing scans on collections.
For DynamoDB workspaces, there are no options to download a Privacy Report CSV or PDF.
DynamoDB workspaces do not support workspace inheritance.
For DynamoDB workspaces, there is no Database View or Table View. Instead, DynamoDB workspaces have a Collection View.
This view allows you to perform the same functions as Table View, but the display is more like Database View. For more information, go to Using Collection View.
Collection mode is the term for table mode in DynamoDB workspaces.
DynamoDB workspaces only support the De-Identify, Truncate, and Preserve Destination collection modes.
DynamoDB workspaces cannot use the following generators:
Algebraic
Alphanumeric Key
Array Character Scramble
Array JSON Mask
Array Regex Mask
Cross-Table Sum
CSV Mask
Event Timestamps
HTML Mask
JSON Mask
SIN
Timestamp Shift
URL
DynamoDB workspaces only support self-consistency.
You cannot make a DynamoDB field consistent with another field.
DynamoDB workspaces do not support subsetting.
DynamoDB workspaces do not support upsert.
For DynamoDB workspaces, you cannot write the destination data to a container repository.
For DynamoDB workspaces, you cannot write the destination data to an Ephemeral snapshot.
For DynamoDB workspaces, there is no option to run post-job scripts after a job.
You can create webhooks that are triggered by data generation jobs.
During workspace creation, under Connection Type, click DynamoDB.
For the source database, you can use either:
A local DynamoDB instance. If you use a local instance, it must be publicly accessible.
The cloud instance.
To connect to a local instance:
Toggle Use Local DynamoDB Instance to the on position.
In the Server field, provide the path to the server that hosts DynamoDB.
In the Port field, specify the port to use to connect to DynamoDB.
To test the connection to the source database, click Test Source Connection.
When you connect to the cloud instance, Structural prompts you to provide the access credentials.
You can either provide the credentials, or use an assumed role.
On a self-hosted Structural instance, if you do not provide credentials in the workspace configuration, then Structural uses either:
The credentials for the IAM role on the host machine.
The credentials set in the following environment settings:
TONIC_AWS_ACCESS_KEY_ID - An AWS access key that is associated with an IAM user or role.
TONIC_AWS_SECRET_ACCESS_KEY - The secret key that is associated with the access key.
TONIC_AWS_REGION - The AWS Region to send the authentication request to.
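For example, on a self-hosted deployment, these settings might be provided as environment variables before you start Structural. The values shown here are placeholders; substitute the access key, secret key, and Region for your own AWS account:

```shell
# Placeholder credentials for a self-hosted Structural deployment.
# Replace each value with the key, secret, and Region for your account.
export TONIC_AWS_ACCESS_KEY_ID="AKIAIOSFODNN7EXAMPLE"
export TONIC_AWS_SECRET_ACCESS_KEY="wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"
export TONIC_AWS_REGION="us-east-1"
```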
To connect to the cloud instance:
Toggle Use Local DynamoDB Instance to the off position.
To use an assumed role:
Click Assume Role.
In the Role ARN field, provide the Amazon Resource Name (ARN) for the role.
In the Session Name field, provide the role session name.
If you do not provide a session name, then Structural automatically generates a default unique value. The generated value begins with TonicStructural.
In the Duration (in seconds) field, provide the maximum length in seconds of the session.
The default is 3600, indicating that the session can be active for up to 1 hour.
The provided value must be less than the maximum session duration that is allowed for the role.
To provide user credentials:
Click User Credentials.
In the AWS Access Key field, enter the AWS access key that is associated with an IAM user or role.
In the AWS Secret Key field, enter the secret key that is associated with the access key.
In the AWS Session Token field, you can optionally provide a session token for a temporary set of credentials.
From the AWS Region dropdown list, select the AWS Region to send the authentication request to.
To test the connection to the source database, click Test Source Connection.
By default, data generation is not blocked as long as schema changes do not conflict with your workspace configuration.
To block data generation when there are any schema changes, regardless of whether they conflict with your workspace configuration, toggle Block data generation on schema changes to the on position.
For the destination database, you can use either:
A local DynamoDB instance. If you use a local instance, it must be publicly accessible.
The cloud instance.
The destination database:
Cannot be on the same local instance as the source database.
Cannot be tied to the same AWS account as the source database.
To connect to a local instance:
Toggle Use Local DynamoDB Instance to the on position.
In the Server field, provide the path to the server that hosts DynamoDB.
In the Port field, specify the port to use to connect to DynamoDB.
To test the connection to the destination database, click Test Destination Connection.
When you connect to the cloud instance, Structural prompts you to provide the access credentials.
You can either provide the credentials, or use an assumed role.
On a self-hosted Structural instance, if you do not provide credentials in the workspace configuration, then Structural uses either:
The credentials for the IAM role on the host machine.
The credentials set in the following environment settings:
TONIC_AWS_ACCESS_KEY_ID - An AWS access key that is associated with an IAM user or role.
TONIC_AWS_SECRET_ACCESS_KEY - The secret key that is associated with the access key.
TONIC_AWS_REGION - The AWS Region to send the authentication request to.
To connect to a cloud instance:
Toggle Use Local DynamoDB Instance to the off position.
To use an assumed role:
Click Assume Role.
In the Role ARN field, provide the Amazon Resource Name (ARN) for the role.
In the Session Name field, provide the role session name.
If you do not provide a session name, then Structural automatically generates a default unique value. The generated value begins with TonicStructural.
In the Duration (in seconds) field, provide the maximum length in seconds of the session.
The default is 3600, indicating that the session can be active for up to 1 hour.
The provided value must be less than the maximum session duration that is allowed for the role.
To provide user credentials:
Click User Credentials.
In the AWS Access Key field, enter the AWS access key that is associated with an IAM user or role.
In the AWS Secret Key field, enter the secret key that is associated with the access key.
In the AWS Session Token field, you can optionally provide a session token for a temporary set of credentials.
From the AWS Region dropdown list, select the AWS Region to send the authentication request to.
To test the connection to the destination database, click Test Destination Connection.
For all source tables, you must enable point-in-time recovery.
If point-in-time recovery is not enabled for every source table, then data generation fails.
Structural automatically enables point-in-time recovery when it creates the destination tables.
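For example, one way to enable point-in-time recovery on a source table is through the AWS CLI; the table name here is a placeholder:

```shell
# Enable point-in-time recovery on a source table (placeholder table name).
aws dynamodb update-continuous-backups \
  --table-name my-source-table \
  --point-in-time-recovery-specification PointInTimeRecoveryEnabled=true
```

You must repeat this for every source table, because data generation fails if any source table does not have point-in-time recovery enabled.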
For each source and destination database, you must set up a corresponding S3 bucket. Structural uses the S3 buckets for temporary storage.
You must use a separate S3 bucket for each database. You cannot use the same S3 bucket for both the source and destination database.
The source database user must have access to the source S3 bucket.
The destination database user must have access to the destination S3 bucket.
Structural deletes files from the S3 buckets after each job. However, to ensure that files do not accumulate, we recommend that you set up lifecycle rules to empty the S3 buckets.
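As a sketch, a lifecycle configuration that expires objects one day after they are created might look like the following; the rule ID is a placeholder, and you would apply an equivalent rule to each bucket:

```json
{
  "Rules": [
    {
      "ID": "expire-structural-temp-files",
      "Status": "Enabled",
      "Filter": {},
      "Expiration": { "Days": 1 }
    }
  ]
}
```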
The source database user must have permissions for DynamoDB and Amazon S3. We recommend that you create a custom policy for each service.
Note that the required Amazon S3 permissions include access to the S3 bucket that you created for the source database.
Here is an example of a DynamoDB policy that grants the required permissions for the source database user:
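The following is a minimal sketch of such a policy, assuming that Structural reads the source tables and exports them through point-in-time recovery. The action list, Region, account ID, and table ARN are illustrative placeholders; confirm the exact permissions that your Structural version requires:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "StructuralSourceDynamoDB",
      "Effect": "Allow",
      "Action": [
        "dynamodb:ListTables",
        "dynamodb:DescribeTable",
        "dynamodb:DescribeContinuousBackups",
        "dynamodb:Scan",
        "dynamodb:ExportTableToPointInTime",
        "dynamodb:DescribeExport"
      ],
      "Resource": "arn:aws:dynamodb:us-east-1:123456789012:table/*"
    }
  ]
}
```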
For Amazon S3, the source database user requires access to the S3 bucket that you created for the source database.
Here is an example of an Amazon S3 policy that grants the required permissions for the source database user:
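The following is a sketch of such a policy, assuming that Structural lists the bucket and reads, writes, and deletes the temporary objects it stores there. The bucket name is a placeholder for the source bucket that you created:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "StructuralSourceBucket",
      "Effect": "Allow",
      "Action": ["s3:ListBucket", "s3:GetBucketLocation"],
      "Resource": "arn:aws:s3:::my-structural-source-bucket"
    },
    {
      "Sid": "StructuralSourceObjects",
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:PutObject",
        "s3:DeleteObject",
        "s3:AbortMultipartUpload"
      ],
      "Resource": "arn:aws:s3:::my-structural-source-bucket/*"
    }
  ]
}
```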
Here is an example of a DynamoDB policy that grants the required permissions for the destination database user:
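The following is a sketch of such a policy, assuming that Structural creates the destination tables, imports the transformed data from S3, and enables point-in-time recovery on the tables it creates. As with the source policy, the action list and ARN are placeholders to verify against your Structural version:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "StructuralDestinationDynamoDB",
      "Effect": "Allow",
      "Action": [
        "dynamodb:ListTables",
        "dynamodb:DescribeTable",
        "dynamodb:CreateTable",
        "dynamodb:DeleteTable",
        "dynamodb:PutItem",
        "dynamodb:BatchWriteItem",
        "dynamodb:ImportTable",
        "dynamodb:DescribeImport",
        "dynamodb:DescribeContinuousBackups",
        "dynamodb:UpdateContinuousBackups"
      ],
      "Resource": "arn:aws:dynamodb:us-east-1:123456789012:table/*"
    }
  ]
}
```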
For Amazon S3, the destination database user requires access to the S3 bucket that you created for the destination database.
Here is an example of an Amazon S3 policy that grants the required permissions for the destination database user:
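The following is a sketch of such a policy, with the same shape as the source bucket policy. The bucket name is a placeholder for the destination bucket that you created:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "StructuralDestinationBucket",
      "Effect": "Allow",
      "Action": ["s3:ListBucket", "s3:GetBucketLocation"],
      "Resource": "arn:aws:s3:::my-structural-destination-bucket"
    },
    {
      "Sid": "StructuralDestinationObjects",
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:PutObject",
        "s3:DeleteObject",
        "s3:AbortMultipartUpload"
      ],
      "Resource": "arn:aws:s3:::my-structural-destination-bucket/*"
    }
  ]
}
```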
In addition to DynamoDB and Amazon S3, the destination database user requires access to Amazon CloudWatch.
Here is an example of an Amazon CloudWatch policy that grants the required permissions for the destination database user:
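The following is a sketch of such a policy, assuming that the DynamoDB import from S3 writes its job logs to CloudWatch Logs. The log group ARN is a placeholder; confirm the exact log permissions that your Structural version requires:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "StructuralDestinationCloudWatch",
      "Effect": "Allow",
      "Action": [
        "logs:CreateLogGroup",
        "logs:CreateLogStream",
        "logs:DescribeLogGroups",
        "logs:DescribeLogStreams",
        "logs:PutLogEvents"
      ],
      "Resource": "arn:aws:logs:us-east-1:123456789012:log-group:/aws-dynamodb/*"
    }
  ]
}
```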