How to export your data to AWS S3

Export your Browse AI data directly to Amazon S3 for secure cloud storage, automated processing, and integration with your AWS workflows. Perfect for large datasets and enterprise data pipelines.

Written by Melissa Shires
Updated yesterday

S3 exports are ideal for:

  • Large datasets: no file size limits or download issues.

  • Automated workflows: process data with AWS services.

  • Data archiving: secure, cost-effective long-term storage.

  • Team collaboration: centralized data access.

  • Integration pipelines: connect to data warehouses.

  • Scheduled backups: automatic regular exports.

Prerequisites

Before exporting to S3, you need:

  1. βœ“ AWS account with S3 bucket

  2. βœ“ Data in your robot's Tables

Manual export to S3

One-time export process

  1. Navigate to your Table and configure your view based on the data you want to extract (filters, columns, and historical data preference).

  2. Click Export

  3. Select S3 export option and choose either:

    • Export as JSON to S3

    • Export as CSV to S3

  4. Click the export button

πŸ’‘ After you click Export, the data transfers directly to your bucket; no local download is needed. Progress is displayed directly in Tables.

Understanding S3 file structure

How Browse AI organizes your exports

Every export creates a unique directory in your S3 bucket.

your-bucket/
└── export_20250415T175912Z_32ffd473-4476-42bc-96bb/
    β”œβ”€β”€ Main.csv
    β”œβ”€β”€ Products.csv
    └── Reviews.csv
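Because every export lives under a single prefix, you can fetch the whole directory programmatically. A minimal sketch using boto3 (the bucket and prefix names are placeholders, and AWS credentials are assumed to be configured):

```python
import os

def export_filename(key, prefix):
    """Map an S3 object key to a bare filename relative to the export prefix."""
    return key[len(prefix):].lstrip("/")

def download_export(bucket, prefix, dest="."):
    """Download every file under one export prefix to a local folder.

    Placeholder names throughout; requires boto3 and configured credentials.
    """
    import boto3  # imported lazily so the path helper stays easy to test
    s3 = boto3.client("s3")
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
        for obj in page.get("Contents", []):
            local = os.path.join(dest, export_filename(obj["Key"], prefix))
            s3.download_file(bucket, obj["Key"], local)

# Example call (placeholder bucket/prefix):
# download_export("your-bucket", "export_20250415T175912Z_32ffd473-4476-42bc-96bb/")
```

The paginator matters for large exports: a single `list_objects_v2` call returns at most 1,000 keys.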

Directory naming pattern

  • Timestamp: ISO 8601 format (sortable)

  • Unique ID: Prevents any conflicts

  • Optional prefix: If you specified a folder path

export_{timestamp}_{unique-id}/
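Because the timestamp uses ISO 8601 basic format, a plain string sort on it is also a chronological sort. A small sketch that picks the most recent export from a list of directory names (the regex is an assumption based on the pattern above):

```python
import re

# Matches names like export_20250415T175912Z_32ffd473-4476-42bc-96bb/
EXPORT_DIR = re.compile(r"export_(?P<ts>\d{8}T\d{6}Z)_(?P<uid>[0-9a-f-]+)")

def latest_export(prefixes):
    """Return the most recent export directory, or None if none match.

    ISO 8601 timestamps sort lexicographically in chronological order,
    so max() on the timestamp string finds the newest export.
    """
    dated = [(m.group("ts"), p) for p in prefixes
             if (m := EXPORT_DIR.search(p))]
    return max(dated)[1] if dated else None
```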

File naming conventions

Small files (< 100MB):

Main.csv
Products.json
Reviews.csv

Large files (automatically chunked):

Main.part1.json
Main.part2.json
Main.part3.json

Individual records (if configured):

0005541b-d61d-42d7-ba24-17600e0068e5.json
0006642c-e72e-53e8-cb35-28711f1179f6.json
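One thing to watch when processing chunked files: a plain alphabetical sort puts `Main.part10.json` before `Main.part2.json`. A small sketch that orders chunks by their numeric part index instead:

```python
import re

def sort_chunks(keys):
    """Order chunked export files by their numeric part index.

    Extracts the number from the '.partN.' segment so part2 sorts
    before part10, which a plain string sort would get wrong.
    """
    def index(key):
        m = re.search(r"\.part(\d+)\.", key)
        return int(m.group(1)) if m else 0
    return sorted(keys, key=index)
```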

Scheduled exports to S3

Set up scheduled exports to automatically create a live data pipeline.

  1. Navigate to your robot's dashboard.

  2. Click the Tables tab.

  3. Find your configured AWS S3 integration.

  4. Click the three-dot menu (β‹―) next to the Export button.

  5. Select "Create scheduled export" from the dropdown.

πŸ“– Learn more about how to set up scheduled exports to S3.

Accessing your files

AWS Console

  1. Navigate to your S3 bucket

  2. Browse to the export directory

  3. Download or process as needed

AWS CLI

# List the files in one export
aws s3 ls s3://your-bucket/export_20250415T175912Z_32ffd473/
# Download the whole export to the current directory
aws s3 cp s3://your-bucket/export_20250415T175912Z_32ffd473/ . --recursive

SDK (Python)

import boto3

s3 = boto3.client('s3')

# Download a single exported file to a local path
s3.download_file('your-bucket', 'export_path/Main.csv', 'local_file.csv')