[Server-devel] Daily Reporting on traffic and DHCP leases

Juan G. Narvaez gnrvzsix at gmail.com
Tue Apr 20 22:02:25 EDT 2010


Wow! This is very useful!

I was thinking about how to generate a networking report a few days ago...
Tomorrow I will start on the implementation.

Thank you very much!!!

J. Guillermo Narváez
OLPC XS Implementation Team - La Rioja | Argentina


On Tue, Apr 20, 2010 at 8:52 PM, Anna <aschoolf at gmail.com> wrote:

> First of all, this is very kludgy and rather embarrassing, but I thought
> I'd post it in case other folks wanted to expand on the idea or someone out
> there needed a similar report.
>
> Anyway, on XS 0.6, I managed to cobble together a daily report that looks
> like this:
>
>     <schoolname> 04-15-2010 XOs:37 Other:7
>
>     eth0 (faces the internet)
>     _____Received___Transmitted___Total
>     today 573.00 MB | 46.22 MB | 619.23 MB
>
>     eth1 (to the XOs)
>     _____Received___Transmitted___Total
>     today 70.66 MB | 702.79 MB | 773.45 MB
>
> Basically I wanted daily reporting on how many unique XOs and other devices
> (iPhones, laptops, etc) got dhcp leases and about how much traffic went
> through the tubes. The report needs to be short and clear enough so that
> non-technical folks can understand it.  Also, I wanted a gist of the
> differential in traffic due to Squid caching, which can be seen in the above
> example.
>
> My situation here is complicated by the APs getting dynamic dhcp leases.
> Yes, I know, but they're not under my control and I've given up on that
> point as long as it's working.  I don't want them showing up as "Other"
> devices, as that's not an accurate count of what's going on, so I had to
> account for them in the report.
>
> It's a couple of embarrassingly simple scripts in cron.  Here are the
> dependencies I'm using:
>
> dhcpstatus    http://dhcpstatus.sourceforge.net/
> nmap
> vnstat
> Optional: A working mail server (I'm using ssmtp with gmail)
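> For the optional mail piece, a minimal ssmtp setup for relaying through Gmail
> looks roughly like this (a sketch; the addresses and password below are
> placeholders, so substitute your own account details):

```
# /etc/ssmtp/ssmtp.conf (sketch -- values are placeholders)
root=me@gmail.com
mailhub=smtp.gmail.com:587
AuthUser=me@gmail.com
AuthPass=your-password-here
UseSTARTTLS=YES
```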
>
> There are a couple of edits to the dhcpstatus configuration files, but
> nothing major and it's clear what to do in the README.  If you don't want to
> email the report, it's easy enough to throw into a text file for Apache or
> whatever.
>
> Once I configured dhcpstatus, I created a directory for the data in
> /usr/local/dhcpstatus and did this silly little script in /usr/local/bin,
> making sure that /usr/local/bin was in my path in the crontab.
>
> [root at schoolserver ~]# cat /usr/local/bin/hourly-dhcp-lease-dump
> #!/bin/bash
> #Pull a list of all the MAC addresses with current DHCP leases and dump into a file
> dhcpstatus -s 172.18.96.0 | grep Mac | awk '{print $3}' | sort | uniq >> /usr/local/dhcpstatus/data/hourlydump
>
> I have it run every hour from 6:30 AM to 6:30 PM.  Probably overkill, but
> it only takes a few seconds.
>
> 30 6-18 * * * /usr/local/bin/hourly-dhcp-lease-dump >/dev/null 2>&1
>
> Also in /usr/local/bin, I did up the script for the report.  I put lines
> around it as it's rather long.
>
> ________________________________________________________________________________
>
> [root at schoolserver ~]# cat /usr/local/bin/daily-dhcp-lease-report
> #!/bin/bash
>
> ##### Who do we want to send this to?  Separate multiple email addresses with a comma
> EMAIL="me at gmail.com,you at gmail.com"
>
> #####File Location Variables
>
> # This is where the hourly MAC address dumps have been going all day
> DATA="/usr/local/dhcpstatus/data/hourlydump"
>
> # This is where we keep all the counts for historical purposes
> HISTORY="/usr/local/dhcpstatus/data/history"
>
> ##### Define some odds and ends
>
> # Good enough range to get any APs on the network
> RANGE="172.18.96.0/24 172.18.97.0/24 172.18.98.0/24 172.18.99.0/24"
>
> # Pull the school name out of the hostname.
> SERVER=`hostname | awk -F. '{print $2}'`
>
> ##### Dealing with data ...
>
> #-----Getting information on the network via nmap----
> # The only things on the network with open telnet ports should be the access points
> # If you don't need this, then you don't need to run this script as root
> AP=`nmap -sS -p 23 $RANGE | grep -c open`
>
> # Make sure the daily dump file doesn't have empty lines
> sed -i '/^$/d' $DATA
>
> # Get a count of all the unique XOs in the hourly dump file
> # This will change with the XO 1.5
> XO=`cat $DATA | sort | uniq | grep -c 00:17:c4`
>
> # Get a count of all the unique devices in the hourly dump file
> ALL=`cat $DATA | sort | uniq | wc -l`
>
> # Calculate the number of non-XO Devices
> OTHER="$(($ALL - $XO - $AP))"
>
> ##### Create the body of the email with the network stats
>
> vnstat -i eth0 > /tmp/eth0
> vnstat -i eth1 > /tmp/eth1
> echo "eth0 (faces the internet)" > /tmp/stats
> #Edit this line if stuff isn't lining up right
> echo "___Received___Transmitted___Total" >> /tmp/stats
> grep today /tmp/eth0 | head -n 1 >> /tmp/stats
> echo >> /tmp/stats
> echo "eth1 (to the XOs)" >> /tmp/stats
> #Edit this line if stuff isn't lining up right
> echo "___Received___Transmitted___Total" >> /tmp/stats
> grep today /tmp/eth1 | head -n 1 >> /tmp/stats
>
> ###### Send the report and log the data
>
> # This is the email subject line
> STATUS="$SERVER $(date +%m-%d-%Y) XOs:$XO Other:$OTHER"
>
> # Log the daily status in the history file
> echo $STATUS >> $HISTORY
>
> # Email the daily status email.  Also works for blog posts.
> mail -s "$STATUS" $EMAIL < /tmp/stats
>
> ##### Cleanup for tomorrow
>
> #Clean out the hourlydump file for the next day
> rm -f $DATA > /dev/null
>
> ________________________________________________________________________________
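> The counting logic in the middle of the script can be exercised against a
> throwaway dump file, which is handy for testing without a live dhcpd.  The
> MAC addresses below are made up; 00:17:c4 is the XO OUI the script greps for:

```shell
#!/bin/bash
# Exercise the XO/other counting pipeline on a sample dump file.
# MAC addresses are invented; 00:17:c4 is the XO OUI prefix from the script.
DUMP=$(mktemp)
cat > "$DUMP" <<'EOF'
00:17:c4:aa:bb:01
00:17:c4:aa:bb:01
00:17:c4:aa:bb:02
3c:07:54:11:22:33
EOF
XO=$(sort -u "$DUMP" | grep -c 00:17:c4)
ALL=$(sort -u "$DUMP" | wc -l)
echo "XOs:$XO All:$ALL Other:$((ALL - XO))"
rm -f "$DUMP"
```

> The duplicate lease only counts once, so this prints XOs:2 All:3 Other:1.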
>
> I have that in the crontab for 6:45 PM so it's ready for night owls and
> early risers.
>
> 45 18 * * * /usr/local/bin/daily-dhcp-lease-report
>
> Rotating the history file really isn't a priority as it's one short line of
> text per day.
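> Since the history file accumulates one status line per day in the subject-line
> format ("schoolname 04-15-2010 XOs:37 Other:7"), a long-term summary can be
> pulled out with awk.  A sketch, using the same path as the script above:

```shell
# Average daily XO count from the history file.
# Assumes lines like: schoolname 04-15-2010 XOs:37 Other:7
HISTORY="/usr/local/dhcpstatus/data/history"
awk -F'XOs:' '{ split($2, a, " "); sum += a[1]; n++ }
              END { if (n) printf "Average XOs over %d days: %.1f\n", n, sum/n }' "$HISTORY"
```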
>
> Thoughts and criticisms are more than welcome.
>
> Anna Schoolfield
> Birmingham
>
>
> _______________________________________________
> Server-devel mailing list
> Server-devel at lists.laptop.org
> http://lists.laptop.org/listinfo/server-devel
>
>
