The Internet of Bots uses a botrank to score the bots it monitors. Every bot can earn up to 50 checkpoints, and the total is multiplied by 2 to convert it to a percentage: a bot with 40 checkpoints has a botrank of (40*2=) 80%. These checkpoints evaluate the transparency and activity of a bot and do not necessarily say anything about its quality!

The checkpoints are divided into two main sections. The first section looks at the information and transparency regarding the bot and its host; it is worth 20 points in total, divided between the categories User Agent(s), Whois, Weblinks and Usage. The second section looks at how often a bot visits and how many domains it actually finds; it is worth 30 points in total.
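
The checkpoint-to-percentage conversion above can be sketched in a few lines; the function name and the range check are illustrative additions, only the 0-50 range and the multiply-by-2 rule come from the article.

```python
def botrank(checkpoints: int) -> int:
    """Convert 0-50 checkpoints to a 0-100% botrank (checkpoints * 2)."""
    if not 0 <= checkpoints <= 50:
        raise ValueError("checkpoints must be between 0 and 50")
    return checkpoints * 2

# The article's own example: 40 checkpoints -> 80%
print(botrank(40))  # -> 80
```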

User Agent(s)

The category User Agent(s) looks at five different variables:

  1. Distinguishable - Is the bot easy to recognize by its user agent:

    Example: Wotbox/2.01 (+

  2. Botname - Does the bot have a clearly stated name:

    Example: Googlebot/2.1

  3. Email - Does the user agent provide an email address where users can ask questions:

    Example: Netcraft SSL Server Survey - contact

  4. Version - Is the current version of the bot mentioned in the user agent:

    Example: CCBot/2.0 (

  5. Mozilla - Does the user agent mention compatibility with Mozilla:

    Example: Mozilla/4.0 (CMS Crawler:
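
The five user-agent checks above could be automated roughly as follows; the patterns are illustrative guesses at each criterion, not the site's actual scoring rules.

```python
import re

def score_user_agent(ua: str) -> int:
    """Award one hypothetical checkpoint per user-agent criterion (max 5)."""
    checks = [
        "bot" in ua.lower() or "crawler" in ua.lower(),   # 1. distinguishable as a bot
        bool(re.search(r"\b\w+[Bb]ot\b", ua)),            # 2. clearly stated bot name
        bool(re.search(r"[\w.+-]+@[\w-]+\.[\w.]+", ua)),  # 3. contact email address
        bool(re.search(r"/\d+(\.\d+)*", ua)),             # 4. version number
        "mozilla" in ua.lower(),                          # 5. Mozilla compatibility
    ]
    return sum(checks)

print(score_user_agent("Googlebot/2.1"))  # -> 3 (distinguishable, name, version)
```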


Whois

The category Whois looks at five different variables:

  1. Public - Is there public information about the owner of the bot / main domain in the Whois registration
  2. Organization - Is the organization owning the domain mentioned in the Whois
  3. Country - Is the country mentioned in the Whois
  4. City - Is the city mentioned in the Whois
  5. Street - Is the street mentioned in the Whois
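
Given a Whois record already parsed into key-value pairs, the five checks above amount to counting non-empty fields. The field names below are assumptions for illustration; real Whois output varies by registrar.

```python
def score_whois(record: dict) -> int:
    """Award one hypothetical checkpoint per public Whois field (max 5)."""
    fields = ["owner", "organization", "country", "city", "street"]
    return sum(1 for f in fields if record.get(f))

# A record disclosing only organization and country scores 2 of 5.
print(score_whois({"organization": "Example Inc.", "country": "NL"}))  # -> 2
```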


Weblinks

The category Weblinks looks at five different variables:

  1. User agent - Does the user agent provide a webpage:

    Example: (+

  2. Crawler - Is there a webpage dedicated to explaining what the bot does
  3. Homepage - Does the owner of the bot have a webpage
  4. Query - Is there a webpage where users can search the collected data
  5. Adding - Is there a webpage where users can enter their domains to be crawled by the bot


Usage

The category Usage looks at five different variables:

  1. Recommended - Is the main goal of the bot clear and good enough to be recommended
  2. Category - To what category does the bot belong
  3. Free Query - Can users freely search for their own data
  4. Register - Is there a possibility for users to register
  5. Logo - Does the bot / main domain have a logo

Want to know more? You can find the details of many bots through the List of Bots or the Index of Bots.