S3 exports are ideal for:
Large datasets: no file size limits or download issues.
Automated workflows: process data with AWS services.
Data archiving: secure, cost-effective long-term storage.
Team collaboration: centralized data access.
Integration pipelines: connect to data warehouses.
Scheduled backups: automatic regular exports.
Prerequisites
Before exporting to S3, you need:
✅ AWS account with S3 bucket
✅ Data in your robot's Tables
Manual export to S3
One-time export process
Navigate to your Table and configure your view based on the data you want to extract (filters, columns, and historical data preference).
Click Export
Select the S3 export option and choose either:
Export as JSON to S3
Export as CSV to S3
Click the export button
💡 After you click Export, the data transfers directly to your bucket; no local download is needed. Progress is displayed directly in Tables.
Understanding S3 file structure
How Browse AI organizes your exports
Every export creates a unique directory in your S3 bucket.
your-bucket/
└── export_20250415T175912Z_32ffd473-4476-42bc-96bb/
    ├── Main.csv
    ├── Products.csv
    └── Reviews.csv
Directory naming pattern
Each export directory follows the pattern export_{timestamp}_{unique-id}/:
Timestamp: ISO 8601 format (sortable)
Unique ID: prevents naming conflicts between exports
Optional prefix: prepended if you specified a folder path
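Because the timestamp is ISO 8601, export directory names sort chronologically as plain strings. Below is a minimal boto3 sketch that finds the most recent export, assuming a placeholder bucket named your-bucket with exports at the bucket root:

import boto3

s3 = boto3.client('s3')

# List top-level "directories" (common prefixes) in the bucket.
# For buckets with more than 1,000 entries, use a paginator instead.
response = s3.list_objects_v2(Bucket='your-bucket', Delimiter='/')
export_dirs = [
    p['Prefix']
    for p in response.get('CommonPrefixes', [])
    if p['Prefix'].startswith('export_')
]

# ISO 8601 timestamps sort lexicographically, so the newest
# export directory is simply the maximum prefix.
latest = max(export_dirs) if export_dirs else None
print(latest)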
File naming conventions
Small files (< 100MB):
Main.csv
Products.json
Reviews.csv
Large files (automatically chunked; see the reassembly sketch below):
Main.part1.json
Main.part2.json
Main.part3.json
Individual records (if configured):
0005541b-d61d-42d7-ba24-17600e0068e5.json
0006642c-e72e-53e8-cb35-28711f1179f6.json
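To work with a chunked export as one dataset, you can fetch and concatenate the parts programmatically. This is a minimal boto3 sketch, assuming each part file is a standalone JSON array of records; the bucket and directory names are placeholders from the example above:

import json
import re

import boto3

BUCKET = 'your-bucket'  # placeholder
PREFIX = 'export_20250415T175912Z_32ffd473-4476-42bc-96bb/'  # placeholder

s3 = boto3.client('s3')

# Collect every Main.partN.json key under the export directory.
paginator = s3.get_paginator('list_objects_v2')
part_keys = [
    obj['Key']
    for page in paginator.paginate(Bucket=BUCKET, Prefix=PREFIX)
    for obj in page.get('Contents', [])
    if re.search(r'Main\.part\d+\.json$', obj['Key'])
]

# Sort numerically: a plain string sort would put part10 before part2.
part_keys.sort(key=lambda k: int(re.search(r'\.part(\d+)\.json$', k).group(1)))

# Assumption: each part is a standalone JSON array, so the parsed
# arrays can be concatenated into one list of records.
records = []
for key in part_keys:
    body = s3.get_object(Bucket=BUCKET, Key=key)['Body'].read()
    records.extend(json.loads(body))

print(f'Reassembled {len(records)} records from {len(part_keys)} parts')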
Scheduled exports to S3
Set up scheduled exports to create an automated, live data pipeline.
Navigate to your robot's dashboard.
Click the Tables tab.
Find your configured AWS S3 integration.
Click the three-dot menu (⋯) next to the Export button.
Select "Create scheduled export" from the dropdown.
📖 Learn more about how to set up scheduled exports to S3.
Accessing your files
AWS Console
Navigate to your S3 bucket
Browse to the export directory
Download or process as needed
AWS CLI
aws s3 ls s3://your-bucket/export_20250415T175912Z_32ffd473/
aws s3 cp s3://your-bucket/export_20250415T175912Z_32ffd473/ . --recursive
SDK (Python)
import boto3

# Download a single exported file to the local machine.
s3 = boto3.client('s3')
s3.download_file('your-bucket', 'export_path/Main.csv', 'local_file.csv')
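boto3 has no built-in recursive download, so fetching an entire export directory means walking the listing yourself. A minimal sketch that mirrors the aws s3 cp --recursive command above; the bucket name and prefix are placeholders:

import os

import boto3

BUCKET = 'your-bucket'  # placeholder
PREFIX = 'export_20250415T175912Z_32ffd473/'  # placeholder

s3 = boto3.client('s3')

# Walk every object under the export directory and download it,
# recreating the same directory layout on the local disk.
paginator = s3.get_paginator('list_objects_v2')
for page in paginator.paginate(Bucket=BUCKET, Prefix=PREFIX):
    for obj in page.get('Contents', []):
        key = obj['Key']
        local_path = os.path.join('downloads', key)
        os.makedirs(os.path.dirname(local_path), exist_ok=True)
        s3.download_file(BUCKET, key, local_path)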