When you sign up for a Tonic Textual account, you can immediately get started with a new pipeline.
Note that these instructions are for setting up a new account on Textual Cloud. For a self-hosted instance, depending on how it is set up, you might either create an account manually or use single sign-on (SSO).
To get started with a new Textual account:
Go to https://textual.tonic.ai/.
Click Sign up.
Enter your email address.
Create and confirm a password for your Textual account.
Click Sign Up.
Textual creates your account, and prompts you to provide some additional information about yourself and how you plan to use Textual.
After you fill out the information and click Get Started, Textual displays the Textual Home page.
It also displays a panel that prompts you to create a Textual pipeline. Textual pipelines extract text from files and transform it into content that you can use to populate a retrieval-augmented generation (RAG) system or large language model (LLM). For information about this workflow, go to Pipelines workflow for LLM preparation.
From the panel, you can create a demo pipeline that uses example data, or you can create a pipeline that uses your own data.
Textual provides example pipeline data that consists of files of different types. To create a pipeline that uses the example data, click Try Demo Pipeline.
The pipeline is named Sample Pipeline. From the pipeline details page, you can view the details of the example files and add more files.
From the Getting Started panel, to create a pipeline that uses your own data:
Click Create a Pipeline.
On the Create A New Pipeline panel, provide the pipeline name.
Select the source option for the files:
Files uploaded from a local filesystem
Amazon S3
Databricks
Azure Blob Storage
To upload the files, click File Upload, then click Save.
For information about managing files for an uploaded file pipeline, go to Selecting files for an uploaded file pipeline.
To have the files come from Amazon S3:
Click Amazon S3.
In the Access Key field, provide an AWS access key that is associated with an IAM user or role.
In the Access Secret field, provide the secret key that is associated with the access key.
From the Region dropdown list, select the AWS Region to send the authentication request to.
To test the credentials, click Test AWS Connection. You can also verify the credentials outside of Textual; see the example after these steps.
Click Save.
For information about configuring an Amazon S3 pipeline, go to Configuring an Amazon S3 pipeline.
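If you want to confirm that the access key, secret, and Region are valid before you enter them in Textual, one option is to call AWS Security Token Service (STS) with the same values. The following is a minimal sketch that uses the boto3 package; the credential and Region values are placeholders, and the check runs outside of Textual.

```python
# Optional check, outside of Textual: confirm that the AWS credentials are valid.
# Requires the boto3 package (pip install boto3). All values below are placeholders.
import boto3

sts = boto3.client(
    "sts",
    aws_access_key_id="AKIA...",          # the access key you plan to enter in Textual
    aws_secret_access_key="your-secret",  # the secret key that matches the access key
    region_name="us-east-1",              # the Region you plan to select in Textual
)

# If the credentials are valid, this returns the account and IAM identity that they belong to.
print(sts.get_caller_identity()["Arn"])
```

Note that this only confirms that the credentials authenticate; it does not confirm that they can read the bucket that contains your pipeline files.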
To have the files come from a Databricks volume:
Click Databricks.
In the Databricks URL field, provide the URL of the Databricks data volume.
In the Access Token field, provide the access token that Textual uses to access the volume.
To test the connection, click Test Databricks Connection. To verify the URL and token outside of Textual, see the example after these steps.
Click Save.
For information about configuring a Databricks pipeline, go to Configuring a Databricks pipeline.
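If you want to confirm that the Databricks URL and access token work before you enter them in Textual, one option is the Databricks SDK for Python. The following is a minimal sketch; the host and token values are placeholders, and the call only confirms that the token authenticates against the workspace, not that it can read the specific volume.

```python
# Optional check, outside of Textual: confirm that the Databricks URL and token authenticate.
# Requires the databricks-sdk package (pip install databricks-sdk). Values are placeholders.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient(
    host="https://adb-1234567890123456.7.azuredatabricks.net",  # the Databricks URL you plan to enter
    token="dapi...",                                             # the access token you plan to enter
)

# If the token is valid for this workspace, this prints the identity that it authenticates as.
print(w.current_user.me().user_name)
```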
To have the files come from Azure Blob Storage:
Click Azure.
In the Account Name field, provide the name of your Azure storage account.
In the Account Key field, provide the access key for the storage account.
To test the connection, click Test Azure Connection. To verify the account name and key outside of Textual, see the example after these steps.
Click Save.
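If you want to confirm that the account name and key are valid before you enter them in Textual, one option is the azure-storage-blob package. The following is a minimal sketch; the account name and key are placeholders, and the check runs outside of Textual.

```python
# Optional check, outside of Textual: confirm that the Azure storage account name and key are valid.
# Requires the azure-storage-blob package (pip install azure-storage-blob). Values are placeholders.
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient(
    account_url="https://mystorageaccount.blob.core.windows.net",  # built from the Account Name
    credential="your-account-key",                                 # the Account Key you plan to enter
)

# If the name and key are valid, this lists the containers in the storage account.
for container in service.list_containers():
    print(container.name)
```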
After you set up an account on Textual Cloud, you start a Textual free trial, during which Textual scans up to 100,000 words for free. Note that Textual counts actual words, not tokens. For example, "Hello, my name is John Smith." counts as six words.
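Textual's exact counting rules are not spelled out here, but the example above matches a simple whitespace-based word count rather than a token count. A rough illustration in Python, under that assumption:

```python
# Rough illustration only; assumes words are whitespace-separated, which matches the example above.
text = "Hello, my name is John Smith."

print(len(text.split()))  # 6 words

# A subword tokenizer would typically split the same sentence into more than 6 units,
# which is why the distinction between words and tokens matters for the trial limit.
```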
After you reach the 100,000-word limit, Textual disables scanning for your account. Until you purchase a pay-as-you-go subscription, you cannot:
Add files to a dataset or pipeline
Run a pipeline
Use the Playground
During your free trial, Textual displays the current usage in the following locations:
On the Home page
In the navigation menu
On the Playground
Textual also prompts you to purchase a pay-as-you-go subscription, which allows you to scan an unlimited number of words at a flat rate per 1,000 words.