PHP Script to grab webmon CSV logs.

hugo.rosario
Posts: 1
Joined: Sat Dec 01, 2012 8:04 pm

PHP Script to grab webmon CSV logs.

Post by hugo.rosario »

Hi there everyone.
First, let me congratulate the Gargoyle team on the great project they have here. I was using OpenWrt with LuCI and moved over to Gargoyle, and I must say it was a real improvement. It makes a lot of things easier.
I moved to Gargoyle because of the built-in Web Usage logging feature, and I am developing a PHP script that grabs the logs from time to time and inserts them into a MySQL database for further processing and reporting.
I struggled for a few hours to find a way to log in to the web interface and download the CSV logs using PHP and the cURL library, since cURL's standard cookie handling (CURLOPT_COOKIEJAR) does not work with the Gargoyle web interface because of a problem with the cookies created by the /utility/get_password_cookie.sh script. I had to grab the cookie from the response header and convert it into a format that can be reused in subsequent cURL requests.
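For anyone curious before downloading, here is a rough sketch of that workaround (not the attached script itself; the login URL, form field name, and CSV path below are assumptions and may need adjusting for your firmware version):

<?php
// Log in and keep the raw response headers so we can read Set-Cookie ourselves.
$router = 'http://192.168.0.1';
$pass = 'password';
$ch = curl_init($router . '/index.php'); // login URL is an assumption
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(array('password' => $pass))); // field name is an assumption
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HEADER, true);
$response = curl_exec($ch);
curl_close($ch);

// Collect every Set-Cookie value and rebuild a single Cookie header string.
preg_match_all('/^Set-Cookie:\s*([^;\r\n]+)/mi', $response, $matches);
$cookieHeader = implode('; ', $matches[1]);

// Reuse the captured cookies on the follow-up CSV download.
$ch = curl_init($router . '/webmon_domains.csv'); // CSV path is an assumption
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_COOKIE, $cookieHeader);
$csv = curl_exec($ch);
curl_close($ch);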
I am here to share that script with all of you.

The attached script will download the webmon_domains.csv and webmon_searches.csv files, convert each into an array, and output them. If you want to process the data, you just need to iterate over the arrays and do whatever you want with the rows; one possible processing step is sketched below.
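For example, once the arrays come back, a loop like this could push the domain log into a MySQL table via PDO. This is purely illustrative: the table name, column names, and the assumed row layout of timestamp / client IP / domain are placeholders, not something the router dictates.

<?php
// Hypothetical processing step: insert parsed webmon_domains rows into MySQL.
$db = new PDO('mysql:host=localhost;dbname=webmon', 'dbuser', 'dbpass');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$stmt = $db->prepare('INSERT INTO domain_log (logged_at, client_ip, domain) VALUES (?, ?, ?)');

// $domains is the array produced for webmon_domains.csv; each row is
// assumed here to be array(epoch_seconds, client_ip, domain).
foreach ($domains as $row) {
    $stmt->execute(array(
        date('Y-m-d H:i:s', (int) $row[0]),
        $row[1],
        $row[2],
    ));
}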
Please remember to edit the script and set the host and password.
$router = 'http://192.168.0.1'; //Location of the Router webinterface
$pass = 'password'; //Router login password
Use it however you like...

Thank you and have a great day.
Attachments
gargoyle.zip
(1.36 KiB) Downloaded 395 times

ispyisail
Moderator
Posts: 5185
Joined: Mon Apr 06, 2009 3:15 am
Location: New Zealand

Re: PHP Script to grab webmon CSV logs.

Post by ispyisail »

Thanks

justin
Posts: 2
Joined: Tue Apr 23, 2013 9:44 am

Re: PHP Script to grab webmon CSV logs.

Post by justin »

Here is a slightly updated version of the script, as the original wasn't properly grabbing the hash and expiration date for me. I now assume a fixed length for both of them (a sketch of the idea follows the list below). This means two things:
1) If the hash ever changes in type or length (very unlikely), this won't work.
2) Somewhere in the year 2286, this won't work.
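
Roughly, the fixed-length matching looks like this (the cookie names, the 32-character hash, and the 10-digit expiry are my assumptions about the header format, shown here against a placeholder response):

<?php
// Placeholder response headers just to illustrate the matching.
$headers = "Set-Cookie: hash=0123456789abcdef0123456789abcdef\r\nSet-Cookie: exp=1700000000\r\n";

preg_match('/hash=([0-9a-f]{32})/i', $headers, $h); // exactly 32 hex characters
preg_match('/exp=([0-9]{10})/', $headers, $e);      // 10 digits: largest value is 9999999999, which falls late in the year 2286

$cookieHeader = 'hash=' . $h[1] . '; exp=' . $e[1];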

Additionally, my router was being kind of a jerk about its SSL certificate, so I had to tell cURL to skip certificate verification entirely. You can do this with the following two lines in the cURL calls:
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 0);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);

Again, try to avoid those two cURL additions if you can, since they disable certificate checking, but for me it wasn't an issue.

EDIT: I just remembered that I also added an array to download all the 'useful' files from the router. I also accidentally left a var_dump() in there... but I can't be arsed to change it now. :)

~Justin
Attachments
gargoyle.gz
(1.2 KiB) Downloaded 430 times
