What are robots and what can they do?

Robots are your automated assistants for extracting and monitoring web data. They perform tasks you'd typically do manually - from basic data extraction to complex automation workflows - all without requiring any coding knowledge.

Written by Nick Simard

Core capabilities

Extracting web data

Robots can capture both structured and unstructured data from any website.

Navigating websites

Our robots handle complex website interactions on your behalf, such as logging in, filling out forms, and clicking through paginated results.

Monitoring for changes

Keep track of important website updates with robots that:

  • Monitor specific pages or data points

  • Send notifications when changes occur (see the sketch after this list)

  • Track historical data changes

  • Support multiple monitoring configurations per robot
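
Conceptually, monitoring boils down to fetching a page on a schedule, fingerprinting the content you care about, and alerting when the fingerprint changes. Here's a minimal TypeScript sketch of that idea; the URL, the hash-the-whole-page approach, and the 15-minute interval are illustrative assumptions, not how our robots are actually built.

```ts
// Simplified idea behind change monitoring: fetch a page on a schedule,
// fingerprint its content, and alert when the fingerprint changes.
// The URL and the 15-minute interval below are placeholders.
import { createHash } from "node:crypto";

const WATCHED_URL = "https://example.com/pricing"; // hypothetical page
let lastHash: string | null = null;

async function checkForChanges(): Promise<void> {
  const body = await (await fetch(WATCHED_URL)).text();
  const hash = createHash("sha256").update(body).digest("hex");

  if (lastHash !== null && hash !== lastHash) {
    // A real robot would send an email/webhook here and record history.
    console.log(`Change detected on ${WATCHED_URL} at ${new Date().toISOString()}`);
  }
  lastHash = hash;
}

checkForChanges();
setInterval(checkForChanges, 15 * 60 * 1000); // poll every 15 minutes
```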

Bulk operations

Scale your data extraction by:

  • Running multiple tasks simultaneously (see the sketch after this list)

  • Processing up to 50,000 URLs in a single bulk run

  • Extracting data from similar page layouts

  • Maintaining consistent data quality at scale
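
To illustrate what running many tasks at once involves, here's a small TypeScript sketch of bounded-concurrency bulk processing. The function names and the default limit of ten parallel tasks are assumptions for the example, not our actual engine's values.

```ts
// Sketch of bounded-concurrency bulk extraction: process a large URL list
// a few at a time instead of all at once. Names are illustrative only.
async function runBulk<T>(
  urls: string[],
  worker: (url: string) => Promise<T>,
  concurrency = 10,
): Promise<T[]> {
  const results: T[] = new Array(urls.length);
  let next = 0;

  async function lane(): Promise<void> {
    while (next < urls.length) {
      const i = next++; // safe: JavaScript runs this single-threaded
      results[i] = await worker(urls[i]);
    }
  }

  // Start `concurrency` lanes that pull from the shared queue.
  await Promise.all(Array.from({ length: concurrency }, lane));
  return results;
}

// Usage: fetch page titles for a list of similar pages.
const urls = ["https://example.com/p/1", "https://example.com/p/2"];
runBulk(urls, async (url) => {
  const html = await (await fetch(url)).text();
  return html.match(/<title>(.*?)<\/title>/)?.[1] ?? "";
}).then(console.log);
```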

Advanced features

Smart adaptation

Our robots automatically:

  • Adapt to website changes

  • Use residential IP addresses for reliable access

  • Emulate human behavior to avoid detection

  • Handle rate limiting and proxy management (sketched below)
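
As a rough illustration of rate-limit handling, the sketch below assumes a site that signals back-pressure with HTTP 429 and a Retry-After header given in seconds; our robots take care of this, along with proxy rotation, automatically.

```ts
// Illustrative rate-limit handling: honor HTTP 429 and the Retry-After
// header before retrying a request. Assumes Retry-After is in seconds.
async function politeFetch(url: string, maxAttempts = 5): Promise<Response> {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    const res = await fetch(url);
    if (res.status !== 429) return res;

    // Server told us to slow down; wait as instructed (default 1s).
    const seconds = Number(res.headers.get("retry-after") ?? "1") || 1;
    await new Promise((resolve) => setTimeout(resolve, seconds * 1000));
  }
  throw new Error(`Still rate-limited after ${maxAttempts} attempts: ${url}`);
}
```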

Security and reliability

Every robot includes:

  • Secure credential handling

  • Automated error recovery

  • Built-in retry mechanisms (sketched below)

  • Performance optimization
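
Here's a generic TypeScript sketch of what automated error recovery and retries look like in principle; the attempt count, delays, and jitter are made-up defaults for illustration, not our actual configuration.

```ts
// Sketch of automated error recovery: retry a flaky task with
// exponential backoff plus random jitter. Defaults are illustrative.
async function withRetries<T>(
  task: () => Promise<T>,
  maxAttempts = 4,
  baseDelayMs = 500,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await task();
    } catch (err) {
      lastError = err;
      // Exponential backoff: 500ms, 1s, 2s, ... plus up to 250ms of jitter.
      const delay = baseDelayMs * 2 ** attempt + Math.random() * 250;
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
  throw lastError;
}
```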

Working with your data

Spreadsheet automation

Turn any website into a live spreadsheet that refreshes automatically whenever the source data changes.

Developer tools

Build web data into your products and projects using our developer API and webhooks.
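
As a purely hypothetical example of what calling a web-data API can look like: the endpoint, payload fields, and environment variable below are placeholders, so consult the actual API reference for real routes and parameters.

```ts
// Hypothetical REST call that kicks off a robot task.
// Endpoint, payload shape, and env var are placeholders, not real routes.
const API_KEY = process.env.ROBOT_API_KEY; // assumption: key-based auth

async function runTask(robotId: string, inputUrl: string): Promise<unknown> {
  const res = await fetch(`https://api.example.com/v1/robots/${robotId}/tasks`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ inputParameters: { originUrl: inputUrl } }),
  });
  if (!res.ok) throw new Error(`Task failed to start: ${res.status}`);
  return res.json();
}
```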

Integration platforms

Connect your robots to over 7,000 applications through integrations such as Zapier, Make, and Airtable.
