AWStats cron job 'awstats.pl' takes more than 24 hours

We are having trouble with the AWStats background job that reads the access.log file: the awstats.pl script takes more than 24 hours and seemingly never finishes.

Our site serves more than 8 million page views per day, which produces a 2 GB Apache access.log file daily.

Here is the console log from running the awstats.pl script manually:

 root@hostname:~# /usr/lib/cgi-bin/awstats.pl -config=org.mysite -update
 Create/Update database for config "/etc/awstats/awstats.org.mysite" by AWStats version 7.4 (build 20150714)
 From data in log file "/var/log/apache2/org.mysite-access.log"...
 Phase 1 : First bypass old records, searching new record...
 Direct access to last remembered record is out of file. So searching it from beginning of log file...
 Phase 2 : Now process new records (Flush history on disk after 20000 hosts)...
 Flush history file on disk (unique url reach flush limit of 5000)
 Flush history file on disk (unique url reach flush limit of 5000)
 Flush history file on disk (unique url reach flush limit of 5000)
 [... the same line repeated about 25 times ...]
 ^C

We stopped it here with Ctrl+C because it had already been running for over an hour.
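The endless "unique url reach flush limit of 5000" messages suggest the history file holds a very large number of distinct URLs (query strings, session IDs, and the like). A quick way to check that cardinality, assuming the standard Apache combined log format where the request path is the 7th whitespace-separated field:

```shell
# Count distinct request URLs in an access log. In the Apache combined
# log format the request path is the 7th whitespace-separated field.
count_unique_urls() {
  awk '{ print $7 }' "$1" | sort -u | wc -l
}

# Usage on the log from the question:
#   count_unique_urls /var/log/apache2/org.mysite-access.log
```

A count in the millions would explain why AWStats keeps flushing its history file, and would point at stripping query strings or session tokens from URLs rather than at the hardware.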

We tried disabling DNS lookups (setting DNSLookup to 0), but it did not help:

Edit in /etc/awstats/awstats.org.mysite.conf:

  # 0 - No DNS Lookup
  # 1 - DNS Lookup is fully enabled
  # 2 - DNS Lookup is made only from static DNS cache file (if it exists)
  # Default: 2
  DNSLookup=0
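One thing worth verifying is that this edit is the value AWStats actually ends up using: per-site configs commonly include a shared file (for example awstats.conf.local on Debian/Ubuntu), and a later directive wins. A small sanity check, assuming the configs live under /etc/awstats:

```shell
# List every active (uncommented) DNSLookup directive with file name and
# line number, so an overriding setting elsewhere becomes visible.
show_dnslookup() {
  grep -Hn '^DNSLookup' "$@"
}

# Usage:
#   show_dnslookup /etc/awstats/*.conf
```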

Hardware should not be the bottleneck; it is an ovh.com HOST-128L dedicated server:

  • Intel Xeon D-1520 – 4 cores / 8 threads
  • 128GB DDR4 ECC 2133 MHz
  • 2 x 480 GB SSD

The operating system is Ubuntu 16.04.3 LTS:

 root@hostname:~# cat /etc/*-release
 NAME="Ubuntu"
 VERSION="16.04.3 LTS (Xenial Xerus)"
 ID=ubuntu
 ID_LIKE=debian
 PRETTY_NAME="Ubuntu 16.04.3 LTS"
 VERSION_ID="16.04"
 VERSION_CODENAME=xenial
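Before concluding anything about hardware, it may help to measure AWStats' real throughput on a truncated copy of the log. A rough sketch, where `org.mysite.sample` is an assumed scratch config whose LogFile points at the sample file (adjust names to the real setup):

```shell
# Time an AWStats update over only the first N lines of the big log.
# Extrapolating the lines/second rate to the full 2 GB file shows
# whether a complete run could ever finish within 24 hours.
sample_run() {
  n=$1
  head -n "$n" /var/log/apache2/org.mysite-access.log > /tmp/sample-access.log
  time /usr/lib/cgi-bin/awstats.pl -config=org.mysite.sample -update
}

# Usage:
#   sample_run 100000
```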

So:

  1. Is there anything we can do to improve AWStats' performance?
  2. Or have we hit AWStats' performance ceiling?