Overview
By default, Tornado uploads files to our managed storage. You can configure your own cloud storage to receive downloads directly. Tornado supports four major cloud providers, giving you the flexibility to use the storage solution that best fits your infrastructure.
Supported Providers
AWS S3 / S3-Compatible
AWS S3, Cloudflare R2, MinIO, DigitalOcean Spaces, Backblaze B2, Wasabi, OVH
Azure Blob Storage
Azure Storage Accounts with Blob containers
Google Cloud Storage
GCS buckets with service account authentication
Alibaba OSS
Alibaba Cloud Object Storage Service
| Provider | Configure | Remove |
|---|---|---|
| S3 / S3-Compatible | POST /user/s3 | DELETE /user/s3 |
| Azure Blob | POST /user/blob | DELETE /user/blob |
| Google Cloud Storage | POST /user/gcs | DELETE /user/gcs |
| Alibaba OSS | POST /user/oss | DELETE /user/oss |
S3 / S3-Compatible Storage
Works with AWS S3 and any S3-compatible provider.
Supported S3 Providers
| Provider | Endpoint Format |
|---|---|
| AWS S3 | https://s3.{region}.amazonaws.com |
| Cloudflare R2 | https://{account_id}.r2.cloudflarestorage.com |
| DigitalOcean Spaces | https://{region}.digitaloceanspaces.com |
| Backblaze B2 | https://s3.{region}.backblazeb2.com |
| Wasabi | https://s3.{region}.wasabisys.com |
| MinIO | https://your-minio-server.com |
| OVH Object Storage | https://s3.{region}.cloud.ovh.net |
Configure S3 Storage
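The request body schema isn't reproduced on this page; as a sketch, a POST /user/s3 configuration might look like this (all field names and placeholder values are assumptions, not confirmed by this documentation):

```json
{
  "access_key": "AKIAXXXXXXXXXXXXXXXX",
  "secret_key": "your-secret-key",
  "bucket": "my-tornado-bucket",
  "region": "us-east-1",
  "endpoint": "https://s3.us-east-1.amazonaws.com"
}
```

For an S3-compatible provider such as Cloudflare R2 or MinIO, substitute the endpoint format from the table above.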
Cloudflare R2 Setup
Required S3 Permissions
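The exact S3 permissions Tornado requires aren't enumerated on this page, but based on the parallel Azure (read/write/delete/list) and GCS (objects create/delete/get/list) permission lists, a least-privilege IAM policy would plausibly look like this (the bucket name is a placeholder):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject", "s3:ListBucket"],
      "Resource": ["arn:aws:s3:::my-tornado-bucket", "arn:aws:s3:::my-tornado-bucket/*"]
    }
  ]
}
```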
Azure Blob Storage
Use Azure Storage Accounts with Blob containers.
Configure Azure Blob
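As a sketch, a POST /user/blob body might look like this (field names are assumptions; the account key is the Base64 key from the Azure Portal):

```json
{
  "account_name": "mystorageaccount",
  "account_key": "base64-encoded-account-key==",
  "container": "tornado-downloads"
}
```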
Azure Setup Guide
Create Storage Account
In Azure Portal, go to Storage Accounts > Create.
- Choose Standard performance
- Select Hot access tier
- Enable Blob public access if needed for direct URLs
Create Container
In your Storage Account, go to Containers > + Container. Name it (e.g., tornado-downloads).
Alternative: SAS Token
You can use a SAS token instead of the account key for more granular permissions.
Required Azure Permissions
When using a SAS token, ensure these permissions are enabled:
- Read (r) - For generating download URLs
- Write (w) - For uploading files
- Delete (d) - For cleanup operations
- List (l) - For validation
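As a sketch, a SAS-based configuration might replace the account key with a token whose permission string covers the four flags above (field names and token values are assumptions):

```json
{
  "account_name": "mystorageaccount",
  "sas_token": "sv=2024-11-04&ss=b&srt=co&sp=rwdl&se=2026-01-01T00:00:00Z&sig=...",
  "container": "tornado-downloads"
}
```

Note the sp=rwdl segment, which grants exactly the read, write, delete, and list permissions listed above.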
Google Cloud Storage
Use GCS buckets with service account authentication.
Configure GCS
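As a sketch, a POST /user/gcs body might pair a bucket name with the service account JSON key described below (field names are assumptions; the key fields follow Google's standard service-account key format):

```json
{
  "bucket": "my-tornado-bucket",
  "credentials_json": {
    "type": "service_account",
    "project_id": "my-project",
    "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
    "client_email": "tornado-storage@my-project.iam.gserviceaccount.com"
  }
}
```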
GCS Setup Guide
Create GCS Bucket
In Google Cloud Console, go to Cloud Storage > Create Bucket.
- Choose a unique name
- Select your preferred region
- Choose Standard storage class
Create Service Account
Go to IAM & Admin > Service Accounts > Create Service Account. Name it (e.g., tornado-storage).
Download JSON Key
In the service account details, go to Keys > Add Key > Create new key > JSON. Download and save the JSON file.
Required GCS Permissions
The service account needs the Storage Object Admin role, which includes:
- storage.objects.create
- storage.objects.delete
- storage.objects.get
- storage.objects.list
Alibaba Cloud OSS
Alibaba OSS uses its own endpoint and credential format.
Configure Alibaba OSS
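As a sketch, a POST /user/oss body might look like this (field names are assumptions; the endpoint comes from the region table below):

```json
{
  "access_key_id": "LTAIxxxxxxxxxxxxxxxx",
  "access_key_secret": "your-access-key-secret",
  "bucket": "my-tornado-bucket",
  "endpoint": "https://oss-ap-southeast-1.aliyuncs.com"
}
```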
OSS Endpoint Regions
| Region | Endpoint |
|---|---|
| China (Hangzhou) | https://oss-cn-hangzhou.aliyuncs.com |
| China (Shanghai) | https://oss-cn-shanghai.aliyuncs.com |
| China (Beijing) | https://oss-cn-beijing.aliyuncs.com |
| Singapore | https://oss-ap-southeast-1.aliyuncs.com |
| US West | https://oss-us-west-1.aliyuncs.com |
| Germany | https://oss-eu-central-1.aliyuncs.com |
Folder Prefix
All providers support an optional folder_prefix to organize your downloads.
The folder prefix is placed inside the base folder (videos/ by default) and combined with any folder parameter you specify in individual job requests.
Base Folder
All providers support an optional base_folder parameter to change the top-level folder where files are organized. By default, files are placed inside a videos/ folder.
Examples
Default behavior (no base_folder specified): files are stored under the videos/ top-level folder.
With base_folder set: your chosen folder replaces videos/ as the top level.
With folder_prefix set: the prefix is nested inside the base folder.
If you don’t specify base_folder, it defaults to videos for backward compatibility. The base_folder is always the top-level folder, with folder_prefix nested inside it.
Presigned URLs
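The layering of base_folder, folder_prefix, and the per-job folder parameter can be sketched as a simple path join (this helper is illustrative only, not part of the Tornado API):

```python
def object_key(filename, base_folder="videos", folder_prefix="", job_folder=""):
    """Compose an object key: base folder first, then the configured
    folder_prefix, then any per-job folder, then the file name.
    Empty segments are skipped; base_folder defaults to "videos"."""
    parts = [base_folder, folder_prefix, job_folder, filename]
    return "/".join(p.strip("/") for p in parts if p)

# Default: files land under videos/
print(object_key("clip.mp4"))                              # videos/clip.mp4
# A custom base_folder replaces videos/ at the top level
print(object_key("clip.mp4", base_folder="media"))         # media/clip.mp4
# folder_prefix is nested inside the base folder
print(object_key("clip.mp4", folder_prefix="campaign-a"))  # videos/campaign-a/clip.mp4
```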
When you poll job status, the s3_url field contains a presigned/signed URL for your bucket:
| Provider | URL Format | Validity |
|---|---|---|
| S3/R2 | AWS Signature V4 | 24 hours |
| Azure Blob | SAS URL | 24 hours |
| GCS | Signed URL V4 | 24 hours |
| Alibaba OSS | OSS Signature | 24 hours |
Legacy Endpoint (S3 Only)
The /user/bucket endpoint still works for S3-compatible storage only:
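As a sketch, a legacy POST /user/bucket body would presumably mirror the /user/s3 fields (names and values are assumptions, not confirmed by this page):

```json
{
  "access_key": "AKIAXXXXXXXXXXXXXXXX",
  "secret_key": "your-secret-key",
  "bucket": "my-tornado-bucket",
  "endpoint": "https://s3.us-east-1.amazonaws.com"
}
```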
Reset to Default Storage
To switch back to Tornado’s managed storage, use the DELETE endpoint for your provider. After removing, all new downloads will use Tornado’s managed storage. Existing files in your custom storage remain untouched.
Troubleshooting
Common Errors
| Error | Cause | Solution |
|---|---|---|
| Credential validation failed: Access Denied | Invalid credentials | Verify access key/secret/token |
| Credential validation failed: NoSuchBucket | Bucket/container doesn’t exist | Create the bucket first |
| Credential validation failed: timeout | Endpoint unreachable | Check endpoint URL and network |
| Invalid service account JSON | Malformed GCS credentials | Validate JSON format |
| Account key must be valid Base64 | Azure key format error | Copy the full key from Azure Portal |
Testing Your Configuration
After configuring storage, create a test job to verify everything works. If the job completes and its s3_url points to your bucket, your storage is configured correctly.
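As a sketch, a minimal test job might look like this (the url field is an assumption; folder is the per-job parameter mentioned in the Folder Prefix section):

```json
{
  "url": "https://example.com/sample-video",
  "folder": "test"
}
```

Poll the job until it completes, then check that the s3_url host matches your bucket rather than Tornado’s managed storage.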
Inline Storage (Per-Request)
For marketplace users or one-off configurations, you can provide storage credentials directly in the job request.
Inline storage credentials:
- Take priority over pre-configured storage
- Are validated before the job is accepted
- Are never stored or logged
- Support all providers (S3, Azure Blob, GCS, OSS)
- Support folder_prefix and base_folder parameters
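As a sketch, a job request with inline S3 credentials might look like this (field names and nesting are assumptions, not confirmed by this page):

```json
{
  "url": "https://example.com/sample-video",
  "storage": {
    "provider": "s3",
    "access_key": "AKIAXXXXXXXXXXXXXXXX",
    "secret_key": "your-secret-key",
    "bucket": "my-tornado-bucket",
    "region": "us-east-1",
    "base_folder": "videos",
    "folder_prefix": "marketplace"
  }
}
```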
For API marketplace users (RapidAPI, Apify, Zyla), inline storage credentials are required for every request.
See the Marketplace Integration guide for details.
Security Best Practices
Use Least Privilege
Create dedicated credentials with only the permissions needed:
- S3: Custom IAM policy with specific bucket access
- Azure: SAS token with limited scope
- GCS: Service account with only Storage Object Admin on specific bucket
Rotate Credentials Regularly
Set up credential rotation:
- AWS: Use IAM Access Analyzer
- Azure: Set SAS token expiration
- GCS: Rotate service account keys
Enable Bucket Logging
Monitor access to your storage:
- S3: Enable Server Access Logging
- Azure: Enable Storage Analytics
- GCS: Enable Cloud Audit Logs
