Surftrackr
Squid - DansGuardian log analyzer
Description
Surftrackr is a log file viewer for Squid and DansGuardian. It makes it easy for you to monitor web usage, including the type of information accessed, the sites visited by your web users, and the amount of information downloaded. Best of all, Surftrackr is free and runs in any web browser.
It is an alternative to Sarg.
Installation
Before installing the RPMs, you must create a new ibay named surftrackr. Then run the following commands:
db accounts setprop surftrackr AllowOverride all
db accounts setprop surftrackr FollowSymLinks enabled
signal-event ibay-modify surftrackr
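To confirm the new properties were applied, you can optionally display the ibay record with the standard SME Server db command:
db accounts show surftrackr
AllowOverride and FollowSymLinks should appear among the listed properties.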
Create the logs database and the MySQL user it will use:
mysqladmin create logs --default-character-set=utf8
mysql
GRANT ALL PRIVILEGES ON `logs`.* TO 'logsu'@'localhost' IDENTIFIED BY 'logsp' WITH GRANT OPTION;
quit
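To check that the grant works, you can optionally try connecting as the new user (the table list will be empty until Surftrackr creates its tables):
mysql -u logsu -plogsp logs -e "SHOW TABLES;"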
Next, download and install Python 2.4 (skip this step if Pootle or Python 2.4 is already installed on your system):
wget http://mirror.contribs.org/smeserver/contribs/nhall/sme7/contribs/pootle/python2.4/rpms/alternatives-0.2.0rc3-9.noarch.rpm
wget http://mirror.contribs.org/smeserver/contribs/nhall/sme7/contribs/pootle/python2.4/rpms/python24-2.4.2-10.el4.pyv.i386.rpm
yum localinstall *.rpm
Then run:
/usr/sbin/alternatives-helper --remove python24
mv /usr/bin/python.alternatives_save /usr/bin/python
mv /usr/share/man/man1/python.1.gz.alternatives_save /usr/share/man/man1/python.1.gz
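You can verify that the system python was restored and that Python 2.4 remains available as a separate binary:
python -V
python2.4 -V
The first command should report the stock system version, the second should report 2.4.x.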
Now download the following RPMs into another clean directory:
wget http://mirror.contribs.org/smeserver/contribs/nhall/sme7/contribs/surftrackr/rpms/Django-0.97_pre-1.noarch.rpm
wget http://mirror.contribs.org/smeserver/contribs/nhall/sme7/contribs/surftrackr/rpms/mod_python-3.1.3-5.1.custom.i386.rpm
wget http://mirror.contribs.org/smeserver/contribs/nhall/sme7/contribs/surftrackr/rpms/MySQL-python-1.2.2-1.i386.rpm
wget http://mirror.contribs.org/smeserver/contribs/nhall/sme7/contribs/surftrackr/rpms/pygooglechart-0.2.2-1.noarch.rpm
wget http://mirror.contribs.org/smeserver/contribs/nhall/sme7/contribs/surftrackr/rpms/pyparsing-1.5.1-1.noarch.rpm
wget http://mirror.contribs.org/smeserver/contribs/nhall/sme7/contribs/surftrackr/rpms/python-dateutil-1.4.1-1.noarch.rpm
wget http://mirror.contribs.org/smeserver/contribs/nhall/sme7/contribs/surftrackr/rpms/setuptools-0.6c9-1.noarch.rpm
wget http://mirror.contribs.org/smeserver/contribs/nhall/sme7/contribs/surftrackr/rpms/surftrackr-20080326-1.noarch.rpm
wget http://mirror.contribs.org/smeserver/releases/7/smecontribs/i386/RPMS/smeserver-mod_python-0.1-1.el4.sme.noarch.rpm
Install:
yum localinstall *.rpm
In the html directory of the surftrackr ibay, open the file settings.py and edit the following variables (a sketch of the edited settings follows this list):
- LOG_FILE - If you use DansGuardian, change it to /var/log/dansguardian/access.log and set logfileformat = 3 (Squid format) in the DansGuardian configuration
- DISABLE_AUTH - See http://surftrackr.net/blog/view/28/multi-user-surftrackr/, but leave it set to False
- LIVE_LOGGING and DATA_FROM_LOG - See the Surftrackr site, but leave them set to True and False respectively
- TIME_ZONE - Important, because it sets the local time used in the reports
- MEDIA_URL - Change this line to your server's IP or FQDN
- SECRET_KEY - Change this key. DO NOT CHANGE ITS LENGTH
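As an illustration only, the edited part of settings.py might end up looking something like this; the variable names come from your installed settings.py, and every value below is a placeholder you must replace with your own:
LOG_FILE = '/var/log/dansguardian/access.log'   # or your Squid access.log if you do not use DansGuardian
DISABLE_AUTH = False
LIVE_LOGGING = True
DATA_FROM_LOG = False
TIME_ZONE = 'Europe/Madrid'   # placeholder; use your own time zone
MEDIA_URL = 'http://your.server.fqdn/surftrackr/media/'   # placeholder; use your IP or FQDN
SECRET_KEY = 'replace-this-with-a-new-key-of-the-same-length'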
Open the file .htaccess, comment out the require valid-user line, and uncomment require user admin so that only the admin user can access Surftrackr.
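The relevant lines in .htaccess would then look roughly like this (the exact wording may differ slightly in your file):
# require valid-user
require user admin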
Usage
You can use it to follow logs LIVE or to process existing logs. The default configuration enables live logging in real time. To start it, run:
cd /home/e-smith/files/ibays/surftrackr/html/utils
python2.4 surftrackr-livelog.py
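If you want the live logger to keep running after you close your shell, one option (not from the Surftrackr docs, just a common approach) is to start it in the background with nohup:
cd /home/e-smith/files/ibays/surftrackr/html/utils
nohup python2.4 surftrackr-livelog.py &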
Then open your server's IP or FQDN in a browser, go to the LIVE page and click GO to see what happens.
If you want to process existing logs, change LIVE_LOGGING to False and DATA_FROM_LOG to True.
Then run:
cd /home/e-smith/files/ibays/surftrackr/html/media
python2.4 logfiles.py
You can set up a cron job to do this every 5 or 10 minutes; a sample crontab entry is sketched below. Run the script manually the first time, because Surftrackr will have to process a lot of old log data.
It is VERY IMPORTANT to run the above script with python2.4 and NOT with python. It is also very important to change into the script's directory before running it. Do NOT run it as python2.4 /home/e-smith/files/ibays/surftrackr/html/media/logfiles.py; change to the directory first.
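For example, a root crontab entry such as the following (a sketch; adjust the interval to taste) runs the import every 10 minutes and changes into the media directory before calling python2.4, as required above:
*/10 * * * * cd /home/e-smith/files/ibays/surftrackr/html/media && python2.4 logfiles.py
Add it with crontab -e as root.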
As the Surftrackr documentation says, every user who logs in to Surftrackr is created in the Django database. Read all the Surftrackr documents and the blog to learn about all the features. Because the tool is not fully documented, I advise you to read all the blog post entries from first to last and download the PDF docs.
Uninstall
yum remove Django mod_python MySQL-python pygooglechart pyparsing python-dateutil setuptools surftrackr smeserver-mod_python
Also remove the surftrackr ibay, drop the logs database and remove the logsu MySQL user.
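One possible way to clean up the database side (a sketch; on older MySQL versions DROP USER only removes accounts without privileges, which is why the REVOKE comes first):
mysqladmin drop logs
mysql
REVOKE ALL PRIVILEGES, GRANT OPTION FROM 'logsu'@'localhost';
DROP USER 'logsu'@'localhost';
quit
The surftrackr ibay itself can be removed from the server-manager.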