nicerobot

About nicerobot

This author has not yet filled in any details.
So far nicerobot has created 19 blog entries.

Email And Social Media Links Crawling From Websites

At Web Robots we often get inquiries about projects to crawl social media links and email addresses from a specific list of small websites. Such data is sought after by growth hackers and salespeople for lead generation purposes. In this blog post we show an example robot which does exactly that; anyone can run such a web scraping project on their own computer using the Web Robots Chrome extension.

To start you will need an account on the Web Robots portal and the Chrome extension, and that's it. We placed a robot called leads_crawler in our portal's Demo space so anyone can use it. In case the robot's code changes, below is the complete source code for this robot. You […]
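The leads_crawler robot in the Demo space is the working reference. Purely as an illustration of the idea, here is a minimal sketch of a step that collects email addresses and social media links from whatever page the robot is visiting; the selectors are generic and the results here are only logged to the console rather than passed through the framework's real output mechanism.

  // Illustrative sketch only – not the leads_crawler source code.
  // Assumes the jQuery that the extension injects into the page.
  var leads = { emails: [], social: [] };

  // Collect email addresses from mailto: links on the current page.
  $('a[href^="mailto:"]').each(function () {
    leads.emails.push($(this).attr('href').replace('mailto:', ''));
  });

  // Collect links pointing to common social media domains.
  $('a[href*="facebook.com"], a[href*="twitter.com"], a[href*="linkedin.com"]').each(function () {
    leads.social.push($(this).attr('href'));
  });

  console.log(JSON.stringify(leads));
  done();  // finish this robot step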

By nicerobot | March 2nd, 2017 | Uncategorized | 0 Comments

Scraping Extension Update – version 2017.2.23

Recently we rolled out an updated version of our main web scraping extension which contains several important updates and new features. This update allows our users to develop and debug robots even faster than before. So what exactly is new?

  1. jQuery has been upgraded from version 1.10.2 to 2.2.4
  2. done() can now take a delay parameter in milliseconds. For example, done(1000); will delay the step finish by 1 second (see the sketch after this list).
  3. New Selectors tab, which allows testing selectors inline and generates robot code. Selectors are tested immediately on the browser's active tab, so the developer can see whether they work correctly. The Copy code button copies JavaScript code to the clipboard, and it can be pasted directly into the robot's step.
  4. Output tab […]
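As a quick illustration of points 2 and 3, a robot step might now look something like the sketch below. The .product-title selector and the output are made-up placeholders; only done() and the injected jQuery are features named in the list above.

  // Sketch of a robot step using the updated extension.
  // '.product-title' is an illustrative placeholder selector.
  var titles = [];

  // jQuery 2.2.4 is injected into the page by the extension.
  $('.product-title').each(function () {
    titles.push($(this).text().trim());
  });

  console.log(titles);

  // New in this release: done() accepts a delay in milliseconds,
  // so this step finishes 1 second after the call.
  done(1000);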
By nicerobot | February 24th, 2017 | Uncategorized | 0 Comments

Writing Better Data Collection Robots

At Web Robots we pride ourselves on fanatical customer support. A large part of this is technical support for robot developers: we maintain live chatrooms and often do screen shares and joint code writing sessions with each of our customers' development teams. This helps us solve most of the problems our customers encounter within minutes, and the difficult ones within several hours. This is not an exaggeration.

Writing a web scraping robot

Based on this accumulated experience we were able to identify the most common mistakes that robot writers make. We published a list with specific code examples that illustrate the mistakes; each mistake comes with an explanation and a code example of the solution. This list is a highly recommended read for anyone new to […]
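A typical example of this kind of mistake (not necessarily one from the published list) is scraping a page before its dynamically loaded content has rendered. Below is a minimal sketch of the problem and a fix, using a hypothetical .result-row selector.

  // Mistake (hypothetical selector '.result-row'): scraping immediately,
  // before AJAX-loaded content has rendered – the result may simply be empty.
  //
  //   var rows = $('.result-row');
  //   done();

  // Solution: poll until the expected elements exist, then scrape and finish the step.
  var wait = setInterval(function () {
    if ($('.result-row').length > 0) {
      clearInterval(wait);
      var rows = $('.result-row').map(function () {
        return $(this).text().trim();
      }).get();
      console.log(rows);
      done();
    }
  }, 500);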

By nicerobot | November 10th, 2016 | Uncategorized | 0 Comments

Migrating PostgreSQL Databases From AWS RDS To Standalone

Intro

AWS RDS is very convenient and takes care of almost all DBA tasks. It just works as long as you stay inside AWS. But if you want a local copy of your database or need to move data to another host, it can get tricky.

TL;DR – for our solution, skip to the last section.

What doesn’t work

AWS daily backups

AWS RDS creates daily backups of your data by default. The first thought would be to grab such a backup and restore it locally. But those are not regular Postgres backups; they are probably VM image copies, though there is no way to know, as you cannot copy or inspect them. The only option is to restore them to another RDS instance.

PostgreSQL replication

AWS uses replication to maintain […]

By nicerobot | October 14th, 2016 | Uncategorized | 0 Comments

Announcement: Robot Naming Change

Recently we started enforcing that robot names may contain only alphanumeric, underscore (_) and dash (-) characters and must be at least 3 characters long. The reason for this change is that robot names are used to generate the run_id and, later, file names. Some atypical characters in robot names were causing problems when processing files with various ETL tools and when storing them in file systems. All existing robots were modified by replacing non-compliant characters with underscores.
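In practice the rule boils down to a simple pattern. Here is a minimal sketch of a client-side check; the helper below is illustrative, not part of our API.

  // Illustrative check for the naming rule: only letters, digits,
  // underscore and dash, and at least 3 characters long.
  function isValidRobotName(name) {
    return /^[A-Za-z0-9_-]{3,}$/.test(name);
  }

  console.log(isValidRobotName('my-robot_1'));  // true
  console.log(isValidRobotName('ab'));          // false – too short
  console.log(isValidRobotName('robot name'));  // false – contains a space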

-Web Robots Team

By nicerobot | September 29th, 2016 | Uncategorized | 0 Comments

Scrape Instagram Followers

Our platform is often used by growth hackers for lead generation in social media networks. One such use case is building a list of Instagram followers from interesting profiles. Today we placed one such robot into our portal's Demo space for anyone to use. The robot is only 30 lines of JavaScript code and works quite fast. We tested it on IBM's Instagram account, which has 78k followers, and it took only 14 minutes to scrape them all.


How to use this robot:

  1. Log in to the Web Robots portal in the Chrome browser.
  2. Make sure you have the Web Robots Chrome extension installed to run the robot.
  3. Open the robot instagram_followers in our extension.
  4. Make sure you are logged in on […]
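The 30-line robot in the Demo space is the working version. Purely to illustrate the approach, here is a rough sketch of the idea: open the followers dialog, keep scrolling it so Instagram loads more entries, and collect the usernames as they appear. Instagram's markup changes often, so the selectors and the fixed stopping condition below are hypothetical placeholders.

  // Rough sketch only – not the instagram_followers robot's code.
  var followers = [];

  var poll = setInterval(function () {
    // Collect usernames currently rendered in the followers dialog (hypothetical selectors).
    $('div[role="dialog"] a[title]').each(function () {
      var name = $(this).attr('title');
      if (followers.indexOf(name) === -1) {
        followers.push(name);
      }
    });

    // Scroll the dialog so the next batch of followers gets loaded.
    var list = $('div[role="dialog"] ul').parent();
    list.scrollTop(list.prop('scrollHeight'));

    // Simplified stop condition: quit after a fixed number of names.
    if (followers.length >= 1000) {
      clearInterval(poll);
      console.log(followers);
      done();
    }
  }, 1000);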
By nicerobot | June 8th, 2016 | Uncategorized | 11 Comments

Scraping Yelp Data

We get a lot of requests to scrape data from Yelp. These requests come in on a daily basis, sometimes several times a day. At the same time, we have not seen a good business case for a commercial Yelp scraping project.

We have decided to release a simple example Yelp robot which anyone can run in Chrome on their own computer, tune to their own requirements and collect some data. With this robot you can save business contact information such as address, postal code, telephone numbers, website addresses, etc. The robot is placed in our Demo space on the Web Robots portal for anyone to use: just sign up, find the robot and use it.
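As a rough sketch of what such a robot step does on a single business page, the snippet below pulls a few contact fields with jQuery. The selectors are hypothetical placeholders – Yelp's markup changes over time, and the robot in the Demo space is the working reference.

  // Illustrative sketch of extracting contact details from one Yelp business page.
  var business = {
    name:    $('h1').first().text().trim(),
    address: $('address').first().text().trim(),
    phone:   $('.biz-phone').first().text().trim(),      // hypothetical selector
    website: $('.biz-website a').first().text().trim()   // hypothetical selector
  };

  console.log(JSON.stringify(business));
  done();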


By nicerobot | March 1st, 2016 | Uncategorized | 0 Comments

New Dataset – UK LPA Search

We are excited to announce UK LPA Search – a search engine for all of the UK's local planning authorities. Until now there was no way to search LPA databases from one place: one had to find each LPA's website and search inside it. Considering there are a few hundred of them, this would not be an easy task for a human. Our robots have no problem indexing all the databases and providing them as a single dataset.

As a bonus, we geocoded all requests and display them on a map, so anyone can see what building permits are being issued around them. Example: Map of building permits in London

By nicerobot | January 27th, 2016 | Uncategorized | 0 Comments

New Kickstarter Dataset

Recently we updated our Kickstarter robot to crawl project subcategories. This lets us collect a richer dataset: for example, on the 2015-12-17 run the robot collected data on 144,263 projects with a running time of only 2 hours! We also started publishing the data in the JSON streaming format, which is simply line-delimited JSON. Previously we stuffed all projects into a single JSON array, and the downside was that a user had to read the entire large JSON file into memory before any processing could start. With JSON streaming it is possible to read one line at a time.

Data is posted in the usual place.
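For anyone processing the file, here is a small Node.js sketch that reads such a line-delimited dump one record at a time instead of loading everything into memory; the local file name is an assumption for the example.

  // Read a line-delimited JSON (JSON streaming) dump one record at a time.
  // 'kickstarter_2015-12-17.json' is an assumed local file name.
  var fs = require('fs');
  var readline = require('readline');

  var rl = readline.createInterface({
    input: fs.createReadStream('kickstarter_2015-12-17.json')
  });

  var count = 0;

  rl.on('line', function (line) {
    if (!line.trim()) { return; }    // skip blank lines
    var record = JSON.parse(line);   // each line is one complete JSON object
    count += 1;
    // ...process the record here...
  });

  rl.on('close', function () {
    console.log('Processed ' + count + ' records');
  });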

By nicerobot | December 31st, 2015 | Uncategorized | 0 Comments

New Features

We are happy to announce some new features in our robot writing framework. These features are:

  • Fork() – splits a robot into many parallel robots and runs them simultaneously. This feature shortens long scraping jobs by parallelising them. Cloud autoscaling handles the necessary instance capacity, so our customers can run hundreds of instances on demand.
  • skipVisited – allows the robot to intelligently skip steps for links that were already visited, avoiding data duplication and saving running time.
  • respectRobotsTxt – crawl target sources in compliance with their robots.txt file.

These features are explained in detail, with examples, on our framework documentation page.
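Purely as a sketch of how these might fit together in a robot step – the exact option names and the Fork() signature are documented on that page, so everything below is an assumed form, not the real API.

  // Hypothetical sketch only – see the framework documentation for the real forms.
  //
  // Assumed robot-level options:
  //   skipVisited: true        // skip steps for links that were already visited
  //   respectRobotsTxt: true   // honour each target's robots.txt
  //
  // Assumed use of Fork(): split a list of URLs across parallel robot instances.
  var urls = $('a.category-link').map(function () {   // hypothetical selector
    return $(this).attr('href');
  }).get();

  Fork(urls);   // assumed call: each forked robot continues from one of these URLs
  done();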

By nicerobot | December 3rd, 2015 | Uncategorized | 0 Comments