Provided by:
Task: Crawling

Xbots is a web spider and classifier that finds information according to the user's specifications, or concepts. Concepts can be trained and are then used by the spider both to select relevant sites and to decide where to crawl next. Search concepts can be created in a variety of ways: one can start by simply giving a few examples of relevant web sites, or begin with more structured or encyclopaedic information such as a Wikipedia page. For example, concepts like 'hotel' or 'restaurant' can be trained in this way, enabling the user to retrieve large numbers of sites in these categories. When the spiders conclude their searches, the web sites found can be sorted by concept relevance and examined from the application's GUI. Furthermore, contact addresses can be extracted from the sites using the WebID plug-in. The information on these sites is extracted and stored in a structured way, e.g. in a database or in XML.
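The crawling strategy described above is a form of focused (best-first) crawling: the frontier is prioritized by how well each page matches the trained concept. The sketch below illustrates the idea only; it is not Xbots code. The in-memory page graph, the keyword-overlap scorer, and the function names are all hypothetical stand-ins for a trained concept model and real HTTP fetching.

```python
import heapq

# Hypothetical in-memory "web": url -> (page text, outgoing links).
# Stands in for real fetching and parsing.
PAGES = {
    "seed": ("directory of hotels and restaurants", ["a", "b"]),
    "a": ("hotel booking with rooms and breakfast", ["c"]),
    "b": ("news about sports", ["d"]),
    "c": ("restaurant menu and dinner reservations", []),
    "d": ("weather report", []),
}

def concept_score(text, concept_terms):
    # Toy relevance model: fraction of concept terms present in the text.
    # A trained classifier would replace this.
    words = set(text.split())
    return sum(t in words for t in concept_terms) / len(concept_terms)

def focused_crawl(seed, concept_terms, limit=10):
    # Best-first crawl: always expand the most concept-relevant frontier page.
    frontier = [(-concept_score(PAGES[seed][0], concept_terms), seed)]
    visited, results = set(), []
    while frontier and len(results) < limit:
        neg_score, url = heapq.heappop(frontier)
        if url in visited:
            continue
        visited.add(url)
        text, links = PAGES[url]
        results.append((url, -neg_score))
        for link in links:
            if link not in visited:
                score = concept_score(PAGES[link][0], concept_terms)
                heapq.heappush(frontier, (-score, link))
    # Sort the retrieved pages by concept relevance, highest first.
    return sorted(results, key=lambda r: -r[1])

ranked = focused_crawl("seed", ["hotel", "restaurant", "menu"])
```

The priority queue makes the spider follow the most promising links first, and the final sort corresponds to presenting results ordered by concept relevance in the GUI.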

Get Started with the service

Contact the Provider