Webbot activity on Jul 19, 2018

name                                activity         pages
Googlebot                           crawler          1 page
IP                                  log spam         1 page
Faked UA string (used as zombie)    hacking attempt  2 pages
Yandex                              search engine    1 page
Yandex                              search engine    1 page
Googlebot                           crawler          1 page
IP                                  log spam         1 page
IP                                  log spam         1 page
Yandex                              search engine    1 page
bingbot                             crawler          1 page
Googlebot                           crawler          1 page
Googlebot                           crawler          1 page
Faked UA string (used as zombie)    hacking attempt  16 pages in 7''
Googlebot                           crawler          1 page
Faked UA string (used as zombie)    hacking attempt  2 pages in 3''
Googlebot                           crawler          1 page
Yandex                              search engine    1 page
IP                                  log spam         1 page
CCBot                               crawler          32 pages in 16' 27''
Googlebot                           crawler          1 page
684,446 visits of identified bots
about 98 a day in 2014
10 today at 5:39 (+25 visitors)
zombies: 2 visits / 29 requests - spammer: 1 visit

What do these statistics mean?

They are extracted from $_SERVER["HTTP_USER_AGENT"] ($HTTP_USER_AGENT with PHP 3) and gethostbyaddr().
As this website's host (free.fr) sometimes filters access, they are biased.
The site statistics do not count robots as visitors: the browser and country they report are ignored. It is therefore quite easy to log the activity of those which are not banned.
Even if a robot reads several pages, it is stored only once, unless it comes back after more than 10 minutes (more than 30 minutes for Google Desktop).
This list gives approximate information, as it assumes a perfect connection to MySQL, which is not the case for this site. But web hosting here is free, so...
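In outline, the extraction described above can look like the following. This is a minimal sketch, not the code actually running on this site: the variable names and the short bot list are illustrative, and a real check should confirm every claimed crawler with a reverse DNS lookup, as shown for Googlebot.

```php
<?php
// Sketch only: identify a visitor from its User Agent string and its
// reverse DNS entry. Bot names and checks here are illustrative.
$userAgent = $_SERVER['HTTP_USER_AGENT'];           // $HTTP_USER_AGENT with PHP 3
$host      = gethostbyaddr($_SERVER['REMOTE_ADDR']); // reverse DNS of the visitor

if (strpos($userAgent, 'Googlebot') !== false
    && preg_match('/\.googlebot\.com$/', $host)) {
    $nameofbot = 'Googlebot';   // UA confirmed by reverse DNS
} elseif (strpos($userAgent, 'YandexBot') !== false) {
    $nameofbot = 'Yandex';
} else {
    $nameofbot = '';            // unknown: treat as a regular visitor
}
?>
```

A UA string that claims to be a browser but comes from a datacenter host, or a known bot name from the wrong network, is what the table above labels a "Faked UA string (used as zombie)".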

Robot Detection

This routine is commented in the page about webbot traps.
Logging the user agents of other visitors is necessary to keep up to date the list of webbots visiting the site and the list of their User Agent strings.
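That logging step can be sketched as follows. The table name and columns are hypothetical (the actual routine is described on the page about webbot traps); the old mysql_* functions are used to match the era of the rest of the code.

```php
<?php
// Sketch only: store the User Agent of any visitor not yet identified
// as a known bot, so new webbots can be added to the lists later.
// The `useragents` table is illustrative, not part of this site's schema.
if ($nameofbot === '') {
    $ua = mysql_real_escape_string($_SERVER['HTTP_USER_AGENT']);
    mysql_query("INSERT INTO useragents (timeofvisit, useragent)
                 VALUES (" . time() . ", '$ua')");
}
?>
```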

The Data Table

Here is the structure of the table robots I use:
# Structure of the table `robots`
CREATE TABLE `robots` (
  `timeoflastaccess` int(10) unsigned NOT NULL default '0',
  `timeofarrival` int(10) unsigned NOT NULL default '0',
  `nameofbot` varchar(64) NOT NULL default '',
  `lastpage` varchar(30) NOT NULL default '',
  `numberofpages` mediumint(8) unsigned NOT NULL default '0',
  KEY `timeoflastaccess` (`timeoflastaccess`),
  KEY `timeofarrival` (`timeofarrival`),
  KEY `nameofbot` (`nameofbot`),
  KEY `numberofpages` (`numberofpages`)
);

You can use double or datetime for the time columns, double or int for numberofpages, and, if necessary, increase the number of characters for lastpage.

Table Update
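The update logic follows from the 10-minute rule stated above: if the robot already has a row whose last access is less than 10 minutes old, it is the same visit, so increment its page count; otherwise insert a new row. A sketch, assuming the `robots` table above and the $nameofbot variable; the code itself is illustrative, not this site's actual script:

```php
<?php
// Sketch: update the `robots` table for the bot named in $nameofbot.
$now  = time();
$name = mysql_real_escape_string($nameofbot);
$page = mysql_real_escape_string($_SERVER['PHP_SELF']);

// Same visit if the last access is less than 10 minutes (600 s) old.
$r   = mysql_query("SELECT timeoflastaccess FROM robots
                    WHERE nameofbot = '$name'
                    ORDER BY timeoflastaccess DESC LIMIT 1");
$row = mysql_fetch_row($r);

if ($row && $now - $row[0] < 600) {
    mysql_query("UPDATE robots
                 SET timeoflastaccess = $now, lastpage = '$page',
                     numberofpages = numberofpages + 1
                 WHERE nameofbot = '$name' AND timeoflastaccess = {$row[0]}");
} else {
    mysql_query("INSERT INTO robots
                 (timeoflastaccess, timeofarrival, nameofbot, lastpage, numberofpages)
                 VALUES ($now, $now, '$name', '$page', 1)");
}
?>
```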

Data Display

We have the name of the robot, the time of its arrival, the time of the last page it loaded, and the total number of pages it requested.
If there is a single page but different values for timeofarrival and timeoflastaccess, then the page was reloaded.
I chose to display the number of pages loaded and the length of the reading time.
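The reading time is simply timeoflastaccess minus timeofarrival, formatted in the minutes/seconds notation used in the table at the top of the page. A sketch (the function name is mine, not the site's):

```php
<?php
// Sketch: format the reading time as  16' 27''  from two Unix timestamps.
function readingtime($arrival, $lastaccess) {
    $s = $lastaccess - $arrival;
    if ($s <= 0) return '';                 // single request: no duration shown
    $min = floor($s / 60);
    $sec = $s % 60;
    return ($min ? $min . "' " : '') . $sec . "''";
}
?>
```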

A similar script is now online here

