Import.io – Turn the Web Into Data With Extractors, Crawlers and Connectors
Using their desktop app you can turn any website into a table of data or an API in just a few minutes without writing any code. There are three methods of data extraction:

a) Extractors – An Extractor is the most straightforward method of getting data; it allows you to turn an unstructured web page into a table of data.

b) Crawlers – A Crawler is an Extractor with legs; it visits every page of a website and extracts data from every page that matches the pattern you define.

c) Connectors – A Connector is an Extractor that lets you interact with a website, for example via a search field, to get to your data.

With this data you can:

1) Create – Datasets allow you to mash together up to 100 live and static data sources.

2) Share – Show off your Dataset by sharing it via social media or a unique link.

3) Export – Download your Dataset as JSON, CSV, HTML, or XLS.

4) Integrate – Using their client libraries you can easily integrate live data in any language.

5) Mix – Combine multiple Connectors to create your own aggregated search.

6) Store – All your data is stored on their cloud servers so you can access it anywhere.

This will be added to Bot Research Subject Tracer™. This will be added to my white paper Web Data Extractors. This has been added to the tools section of Research Resources Subject Tracer™ Information Blog.
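To illustrate the Export and Integrate steps above, here is a minimal Python sketch that takes a Dataset exported as JSON and converts it to CSV. The payload shape and field names ("results", "title", "price") are illustrative assumptions for the example, not the actual Import.io export schema; a real integration would fetch the JSON via one of their client libraries instead of using an inline sample.

```python
import csv
import io
import json

# Hypothetical sample of a Dataset exported as JSON. The keys
# ("results", "title", "price") are assumptions for illustration,
# not the documented Import.io payload format.
SAMPLE_JSON = """
{
  "results": [
    {"title": "Widget A", "price": "9.99"},
    {"title": "Widget B", "price": "14.50"}
  ]
}
"""

def dataset_to_csv(raw_json: str) -> str:
    """Convert a JSON export's list of row objects into CSV text."""
    rows = json.loads(raw_json)["results"]
    buf = io.StringIO()
    # Sort the column names so the CSV header order is deterministic.
    writer = csv.DictWriter(buf, fieldnames=sorted(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

if __name__ == "__main__":
    print(dataset_to_csv(SAMPLE_JSON))
```

The same table-of-rows structure maps just as directly onto the HTML or XLS export options listed above; only the serialization step changes.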