The Web Robots scraping platform has integrated proxy support with capabilities for proxy rotation, multiple proxy pools, geographic targeting, residential proxies, and centralized proxy authentication. The feature is easy to use and offers a lot of flexibility in proxy use strategies. It is available only to paid customers; it is not available to Demo account users.
Demo users can still use proxies, but they have to specify proxy settings manually, as described in our documentation for setProxy().
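
As an illustration, a manually configured robot might look like the sketch below. The proxy URL is a hypothetical placeholder; the exact argument format setProxy() accepts is the one given in that documentation.

steps.start = function(){
    // Hypothetical proxy URL - replace with your own host, port and credentials,
    // in whatever format the setProxy() documentation specifies.
    setProxy('http://username:password@proxy.example.com:8080');
    next('', 'scrape');
    done();
}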

How it works

Initial proxy pools for a particular client’s environment have to be set up by Web Robots admins. The number of pools and the number of IP addresses in a pool are not limited. Proxy pools are usually named after the proxy vendor or a geographic region, so each pool serves a specific purpose. Once proxy pools are available, any robot can start using a proxy simply by calling setProxy(pool_name).

steps.start = function(){
    setProxy('pool_name');   // all requests from this point on go through the pool
    next('', 'scrape');
    done();
}

This command can be repeated during robot execution to change the proxy IP address at any time. When the command is called, the Web Robots backend picks the most rested proxy from the pool. This gives the robot fine control over when to hop to a new proxy IP address: for example, on every page, every 100 pages, or on every new product category.

steps.start = function(){
    setProxy('pool_name');
    $('.product').each((i, v) => {
        // queue a proxy rotation after every 100 products
        if (i % 100 === 0) next('', 'rotateProxy');
        next(v.href, 'scrape');
    });
    done();
}

steps.rotateProxy = function(){
    setProxy('pool_name');   // backend hands out the most rested proxy in the pool
    done();
}

steps.scrape = function(){
    // some code   
    done();
}

Residential Proxy Support

A residential proxy pool can be set up and used just like a normal proxy pool. It will have thousands of exit nodes, and exit nodes are rotated on every request. It is possible to set up residential proxies bound to specific geographic locations. The robot needs only one setProxy(residential_pool_name) invocation to start using the pool; there is no need to call it repeatedly. When using residential proxies (see the sketch after this list):

  1. Optimize the robot for lower network traffic (use fastnext() where possible, etc.).
  2. Increase the number of retries, as residential proxies can fail a small percentage of requests, and those requests need to be repeated. The recommended value is 5-7 retries. Also allow a larger number of failed steps.
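
Putting these points together, a residential-pool robot might look like the sketch below. The pool name residential_pool is a placeholder, and fastnext() is assumed to accept the same (url, step) arguments as next(). Retries and the failed-step allowance are assumed to be configured outside the step code, in the robot's settings, which is why they do not appear here.

steps.start = function(){
    // A single call is enough: residential exit nodes rotate on every
    // request, so there is no need for a rotateProxy step.
    setProxy('residential_pool');   // hypothetical pool name
    $('.product').each((i, v) => {
        // fastnext() keeps network traffic low compared to next()
        fastnext(v.href, 'scrape');
    });
    done();
}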