Browse AI has a Workflows feature that enables you to extract a list of items from one robot, then automatically run a second robot that extracts details from each of those items.
Learn more in How can I create a workflow connecting two robots?
If you want to chain more than two robots
To chain more than two robots, you can create multiple workflows and link them together. Here's how you could connect three robots.
Workflow 1: run robot A, which triggers robot B
Connect Robot A (your initial data source) to Robot B. That means you could extract a list of items, then get details from each of those items (which may include other links you'd like to visit and scrape).
Workflow 2: run robot B, which triggers robot C
Connect Robot B (which now has the output from Robot A) to Robot C.
This creates a sequential flow of data from Robot A to Robot B to Robot C, effectively chaining all three robots.
You can extend this approach to connect even more robots by adding additional workflows in a similar fashion.
Chaining multiple workflows in Browse AI lets you connect more than two robots to perform a sequence of tasks, enabling multi-layered scraping processes that go beyond what a simple two-robot chain can do.
Imagine you need to extract data from a website with a complex structure:
Robot A: Scrapes a list of categories from the homepage.
Robot B: For each category, it extracts a list of product pages.
Robot C: Visits each product page to extract detailed information like price, description, and reviews.
And so on...
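The cascading pattern above can be sketched as plain code. This is an illustrative simulation only, not Browse AI's implementation; the robot functions and sample data below are hypothetical stand-ins for what your robots would actually scrape.

```python
# Illustrative simulation of three chained robots.
# Each "robot" is a function; each workflow feeds one robot's output
# into the next, mirroring Workflow 1 (A -> B) and Workflow 2 (B -> C).

def robot_a():
    # Robot A: scrapes a list of categories from the homepage (sample data).
    return ["books", "electronics"]

def robot_b(category):
    # Robot B: for each category, extracts a list of product page URLs.
    return [f"https://example.com/{category}/item-{i}" for i in (1, 2)]

def robot_c(product_url):
    # Robot C: visits each product page and extracts detailed information.
    return {"url": product_url, "price": "9.99"}

def run_chain():
    # The chain: every item from A triggers a run of B,
    # and every item from B triggers a run of C.
    results = []
    for category in robot_a():
        for url in robot_b(category):
            results.append(robot_c(url))
    return results

# 2 categories x 2 product pages each = 4 detail records
print(len(run_chain()))
```

The nesting is the key point: each workflow fans out, running the next robot once per item extracted by the previous one.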
Automating the entire process using a Monitor
You can automate these entire workflows by adding a monitor to the very first robot (Robot A in our example). This monitor will periodically trigger the robot to run, initiating the entire chain of robots and ensuring your data is always up-to-date.
Here's a quick breakdown:
1. Configure the monitor on Robot A to run at your desired frequency (e.g., hourly, daily, or weekly).
2. When the monitor runs on schedule, it triggers Robot A to start extracting data.
3. Robot A passes its extracted data to Robot B, which then triggers Robot C, and so on. The entire chain executes automatically.
This setup allows you to create a fully automated data extraction pipeline that continuously gathers and processes information without any manual intervention.
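If you would rather kick off the chain from your own scheduler instead of a built-in monitor, Browse AI also exposes a REST API for running robots. The sketch below only builds the HTTP request without sending it; the `POST /v2/robots/{robotId}/tasks` endpoint and `inputParameters` payload are based on Browse AI's public API as I understand it, so verify them against the current API docs, and `ROBOT_ID` and `API_KEY` are placeholders.

```python
# Sketch: triggering the first robot in the chain (Robot A) via the
# Browse AI REST API. Endpoint and payload shape should be verified
# against Browse AI's API documentation before use.
import json
import urllib.request

API_KEY = "YOUR_API_KEY"    # placeholder: your Browse AI API key
ROBOT_ID = "YOUR_ROBOT_ID"  # placeholder: Robot A's ID

def build_trigger_request(robot_id, api_key, input_params=None):
    """Build (but do not send) the request that starts a robot task."""
    url = f"https://api.browse.ai/v2/robots/{robot_id}/tasks"
    payload = json.dumps({"inputParameters": input_params or {}}).encode()
    return urllib.request.Request(
        url,
        data=payload,
        method="POST",
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

req = build_trigger_request(ROBOT_ID, API_KEY)
# To actually run it: urllib.request.urlopen(req)
```

Triggering Robot A this way starts the same cascade as a monitor would: its workflows still hand the extracted items down to Robot B and Robot C automatically.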