Why reuse robots?
When websites maintain consistent layouts across different sections (like product categories, blog archives, or directory listings), you can use a single robot to extract data from all of them. This saves you time and eliminates the need to build and maintain multiple robots for the same website.
QUICK TIP: Browse AI robots are designed to be flexible. As long as the new pages share the layout the robot was trained on, one robot can usually cover every category or section.
Three ways to reuse your robot
Method 1: Change the origin URL for a single run
Best for: quick, one-time extraction from a different page with the same structure.
Open your robot and go to the Run Task tab
In the Origin URL field, paste the new category or section URL
Choose whether to keep or update the default URL:
Leave Save as default unchecked for a one-time change
Select Save as default to permanently update the robot's target URL
Click Run task
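If you'd rather script this step than use the dashboard, here is a minimal sketch that runs a single task against a different origin URL through Browse AI's REST API. The endpoint path and the originUrl input parameter name are assumptions based on the v2 API; confirm both in the current API reference before relying on this.

```python
# Minimal sketch: run one task with a different origin URL via the
# Browse AI REST API. The endpoint path and the "originUrl" input
# parameter name are assumptions; verify them in the API reference.
import requests

API_KEY = "YOUR_API_KEY"    # placeholder: your Browse AI API key
ROBOT_ID = "YOUR_ROBOT_ID"  # placeholder: the robot you want to reuse

response = requests.post(
    f"https://api.browse.ai/v2/robots/{ROBOT_ID}/tasks",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={"inputParameters": {"originUrl": "https://example.com/other-category"}},
    timeout=30,
)
response.raise_for_status()
print(response.json())  # task details, including its id and status
```

Because the origin URL is passed per task, this leaves the robot's saved default URL untouched, matching the unchecked Save as default behavior above.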
Method 2: Use Bulk Run for multiple pages at once
Best for: extracting data from multiple pages with the same structure at once.
Open your robot and go to the Run Task tab
Click the Bulk run tasks button
Download the sample CSV file (it serves as a template for your input data, including URLs and any other variables)
Add the URLs of all sections you want to scrape to the CSV file
Upload your modified CSV
Select the Origin URL column header to map your URLs
Optionally, select your preferred integration for the output
Start the bulk run task
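If you have many section URLs, you can generate the bulk-run CSV programmatically instead of editing it by hand. The sketch below uses only Python's standard library; the originUrl column header is an assumption, so copy the exact header from the sample CSV you downloaded in step 3 so the column maps correctly.

```python
# Minimal sketch: build the bulk-run CSV from a list of section URLs.
# The "originUrl" header is an assumption; mirror the header used in
# the sample CSV so Browse AI can map the column during upload.
import csv

section_urls = [
    "https://example.com/category/books",
    "https://example.com/category/electronics",
    "https://example.com/category/toys",
]

with open("bulk_run.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["originUrl"])  # one column per input variable
    writer.writerows([url] for url in section_urls)
```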
Method 3: Create monitors with different origin URLs
Best for: ongoing, scheduled extraction from specific categories or sections of a website.
Create a new monitor for each section or category you want to track
Specify the corresponding origin URL for each monitor
Configure your desired monitoring frequency
Click Save
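Monitors can also be set up programmatically when you have many sections to track. The sketch below creates one monitor per section URL; the /monitors endpoint and every payload field shown (name, inputParameters, schedule) are assumptions about the v2 API, so check the current API reference for the actual endpoint and schema.

```python
# Minimal sketch: create one monitor per section URL via the REST API.
# The /monitors endpoint and the fields below (name, inputParameters,
# schedule) are assumptions; confirm the payload schema before use.
import requests

API_KEY = "YOUR_API_KEY"    # placeholder
ROBOT_ID = "YOUR_ROBOT_ID"  # placeholder

sections = {
    "Books": "https://example.com/category/books",
    "Electronics": "https://example.com/category/electronics",
}

for name, url in sections.items():
    response = requests.post(
        f"https://api.browse.ai/v2/robots/{ROBOT_ID}/monitors",
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={
            "name": f"{name} monitor",              # assumed field
            "inputParameters": {"originUrl": url},  # assumed field
            "schedule": "every-day",                # assumed field/value
        },
        timeout=30,
    )
    response.raise_for_status()
```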