⚠️ You will need an approved robot in order to automatically extract data from a list of input parameters or URLs.
💡 Need to process more than 50,000 URLs? Use the dedicated Bulk Run feature, which supports up to 500,000 URLs.
In Tables, you can upload a list of input parameters or URLs to automatically extract data from thousands of web pages.
For example:
Scraping large numbers of product detail pages.
Extracting data from multiple search results.
Performing deep scraping across many pages.
Batch processing with different parameters.
How to import a CSV into Tables
Format your CSV file for upload
Your CSV must have column headers matching your robot's input parameters exactly.
This could include:
An originUrl column containing full URLs (including https://)
Additional columns for any other parameters your robot needs
Before uploading, make sure to:
Include a header row with column names.
Use the full address for every URL (including http:// or https://).
Avoid special characters that could break parsing.
Encode the file as UTF-8 (recommended).
💡 You can download a sample file directly from Tables if you're unsure of the correct format. Go to Tables and select Import CSV. From there you can download a sample CSV to use as a template.
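If you assemble the file programmatically, a minimal Python sketch along these lines covers the formatting rules above. The originUrl column is the one mentioned in this guide; the maxPrice column and the output file name are placeholders for whatever your own robot expects.

```python
import csv

# Column headers must match the robot's input parameters exactly.
# "originUrl" comes from this guide; "maxPrice" is a hypothetical extra parameter.
fieldnames = ["originUrl", "maxPrice"]

rows = [
    # URLs must be full addresses, including http:// or https://.
    {"originUrl": "https://example.com/products/1", "maxPrice": "100"},
    {"originUrl": "https://example.com/products/2", "maxPrice": "250"},
]

# UTF-8 is the recommended encoding; newline="" prevents extra blank lines on Windows.
with open("bulk_input.csv", "w", encoding="utf-8", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=fieldnames)
    writer.writeheader()  # the required header row
    writer.writerows(rows)
```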
Import your file to run your bulk task
In your robot's Table, click Import CSV.
Click Click to upload in the sidebar and select your CSV file.
Verify the column mapping is correct and click Start Bulk Run.
Review your Bulk Run progress and results
Once the import starts:
The run appears in your Table with a unique Bulk Run identifier.
Tasks are processed sequentially.
You can filter your Table by the Bulk Run name to see only these results.
Progress indicators show how many tasks are complete versus pending.
💡 Depending on the complexity of your robot and the number of tasks you've triggered, it may take a few minutes for your robot to process the full list of tasks.
Deep scraping with CSV import
A common workflow is using one robot's output as another robot's input. For example, extracting details for every product listed on a category page:
Robot A extracts product URLs from category pages.
Robot B is trained to extract data from a single product page.
To create a dataset of all the product pages from the category pages, you would:
Export Robot A's URLs as CSV.
Import to Robot B to automatically extract details from all pages.
💡 You can also do this in the product using our workflows feature.
How to deep scrape with Tables
⚠️ To use this feature, you need two trained robots:
Robot A: trained to extract a list of URLs or input parameters.
Robot B: trained to extract details from a page using a URL or input parameter.
Export URLs from Robot A's Table.
Rename columns to match Robot B's input parameters.
Keep only the columns Robot B needs (see the sketch after these steps).
Import the file to Robot B (max 50,000 rows).
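The renaming and column filtering can be scripted when Robot A's export is large. Here is a minimal Python sketch, assuming Robot A exported a productUrl column and Robot B expects an originUrl input parameter; the column and file names are placeholders for your own setup.

```python
import csv

MAX_ROWS = 50_000  # CSV import limit noted above

# Hypothetical mapping from Robot A's export column to Robot B's input parameter.
# Replace both names with the ones your own robots actually use.
column_map = {"productUrl": "originUrl"}

with open("robot_a_export.csv", encoding="utf-8", newline="") as src, \
     open("robot_b_input.csv", "w", encoding="utf-8", newline="") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=list(column_map.values()))
    writer.writeheader()
    written = 0
    for row in reader:
        if written >= MAX_ROWS:
            break  # stay within the import limit
        # Keep only the columns Robot B needs, renamed to its parameters.
        out = {new: row[old] for old, new in column_map.items()}
        # Skip rows whose URL is not a full address.
        if not out["originUrl"].startswith(("http://", "https://")):
            continue
        writer.writerow(out)
        written += 1
```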
Common use cases
E-commerce product monitoring
Import URLs of multiple product pages to extract pricing, features, and availability across competitors.
Real estate property data
After scraping property listings from search results, use Bulk Run to get detailed information about each property.
Content aggregation
Import URLs of articles or blog posts to extract and compile content from multiple sources.
Lead generation
Scrape contact details from a list of company pages or professional profiles.