The Internet of Bots analyzes automated bots, spiders and crawlers based on their visits to various websites. Every website receives visits from real people and visits from bots; we try to make the details of these bots a little more transparent. We look at how a bot identifies itself and which website or company is behind it. We also look at its occurrence and behavior and give every bot a rank based on these statistics. All of this data is presented in the List of Bots. We also highlight the most interesting websites that employ bots (Try These), look at User Agents in general, explore how a website can best be structured for crawlers (Optimization) and offer a collection of Useful Links.
There are several domains linked to The Internet of Bots, all of which share their visitor data with our central database. This data consists only of server statistics (no cookies and no Analytics). Bots that identify themselves in their user agent are filtered out and checked by hand. Each of these bots gets a short description and a label, after which an automated script generates the statistics from the collected data.
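As an illustration of the filtering step, here is a minimal Python sketch of how self-identifying bots might be extracted from raw server log lines. The log format (Apache-style combined), the keyword pattern and all function names are assumptions for the example, not the actual implementation behind The Internet of Bots.

```python
import re

# Many bots identify themselves with tokens such as "bot", "spider"
# or "crawler" somewhere in their user agent string.
BOT_PATTERN = re.compile(r"bot|spider|crawler", re.IGNORECASE)

def extract_user_agent(log_line: str) -> str:
    """Pull the user agent (the last quoted field) from a combined-format log line."""
    fields = re.findall(r'"([^"]*)"', log_line)
    return fields[-1] if fields else ""

def filter_bot_visits(log_lines):
    """Return the user agents of visits that self-identify as bots."""
    return [ua for line in log_lines
            if BOT_PATTERN.search(ua := extract_user_agent(line))]

# Two example log lines: one bot visit, one regular browser visit.
logs = [
    '1.2.3.4 - - [10/Oct/2023:13:55:36] "GET / HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '5.6.7.8 - - [10/Oct/2023:13:55:40] "GET /about HTTP/1.1" 200 1024 "-" '
    '"Mozilla/5.0 (Windows NT 10.0; Win64; x64)"',
]
print(filter_bot_visits(logs))  # only the Googlebot user agent remains
```

The matched user agents would then be the input for the manual check and labeling described above.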
When a bot does not identify itself in its user agent, it is not possible (or at least very hard) to determine the person(s), company or website behind it. These bots are therefore not included in the List of Bots. Although there are many such bots, there are also far bigger and better websites that cover them.
The bots in the List of Bots are selected from our central database. Every time a bot returns, the central database and the statistics are updated. Statistics are not shown in real time but are refreshed every few months; this ensures the quality of the information and allows new bots to be listed. Since part of the work is done by hand, some information may be updated later than expected or presented incorrectly. Our goal is transparency, so if you spot an error or have a suggestion, please let us know.