PHP Script to grab webmon CSV logs.
Posted: Sat Dec 01, 2012 8:27 pm
Hi there everyone.
First, let me congratulate the Gargoyle team on the great project they have here. I was using OpenWrt with LuCI and moved over to Gargoyle, and I must say it was a really good improvement. It makes a lot of things easier.
I moved to Gargoyle because of the built-in Web Usage logging feature, and I am developing a script in PHP that grabs the logs from time to time and inserts them into a MySQL database for further processing and reporting.
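Just to give an idea of what I mean by further processing: once the CSV rows are in an array (see the fetching sketch further down), the insert step can look roughly like this. The database, table and column names here are only assumptions for illustration, not part of the attached script, so check the CSV columns on your own router before mapping them.

<?php
// Rough sketch of the MySQL insert step. Connection details, table and
// column names are assumptions for illustration only.
$db = new PDO('mysql:host=localhost;dbname=webmon', 'dbuser', 'dbpass');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$insert = $db->prepare(
    'INSERT INTO webmon_domains (logged_at, client_ip, domain) VALUES (?, ?, ?)'
);

// $rows is the array built from the CSV (one entry per line). The column
// order is an assumption -- verify it against your router's CSV output.
foreach ($rows as $row) {
    $insert->execute(array($row[0], $row[1], $row[2]));
}
?>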
I struggled for a few hours to find a way to log in to the web interface and download the CSV logs using PHP and the cURL library, since the standard cURL cookie handling (CURLOPT_COOKIEJAR) does not work with the Gargoyle web interface due to some problem with the cookies created by the /utility/get_password_cookie.sh script. I had to grab the cookie from the response header and convert it into a format that can be used in further cURL requests.
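In case anyone else runs into this, the workaround boils down to something like the sketch below. The login URL and the name of the password field are assumptions (check what your Gargoyle version's login form actually posts); the important part is reading Set-Cookie out of the raw response headers instead of relying on CURLOPT_COOKIEJAR.

<?php
// Sketch of the cookie workaround. $login_url and the 'password' field name
// are assumptions -- adjust them to match your Gargoyle login form. The key
// point is capturing Set-Cookie from the raw headers ourselves.
function gargoyle_login($login_url, $pass)
{
    $ch = curl_init($login_url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_HEADER, true);                 // keep the headers
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(array('password' => $pass)));

    $response = curl_exec($ch);
    curl_close($ch);

    // Collect every Set-Cookie header and build a "name=value; name2=value2"
    // string that can be passed to CURLOPT_COOKIE on later requests.
    preg_match_all('/^Set-Cookie:\s*([^;\r\n]+)/mi', $response, $matches);
    return implode('; ', $matches[1]);
}

$cookie = gargoyle_login($router . '/login.sh', $pass);     // endpoint path is an assumption
?>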
I am here to share that script with all of you.
The attached script will download the webmon_domains.csv and webmon_searches.csv files, convert them into an array, and output them. If you want to process the data, you just need to iterate over the array and do whatever you want with it.
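For reference, the download-and-parse part is essentially this: a trimmed-down sketch of the idea, not a drop-in replacement for the attached script. $cookie is the value captured at login, and the CSV path shown here is an assumption, so point it at wherever your Gargoyle version serves the files.

<?php
// Sketch of the fetch + parse step. $cookie comes from the login step and
// the exact CSV path used below is an assumption -- adjust it for your router.
function fetch_webmon_csv($csv_url, $cookie)
{
    $ch = curl_init($csv_url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_COOKIE, $cookie);   // reuse the captured cookie
    $csv = curl_exec($ch);
    curl_close($ch);

    // One array entry per non-empty line, each split into its CSV fields.
    $rows = array();
    foreach (explode("\n", trim($csv)) as $line) {
        if ($line !== '') {
            $rows[] = str_getcsv($line);
        }
    }
    return $rows;
}

// Example: dump every row; swap the echo for your own processing or the
// MySQL insert shown earlier.
foreach (fetch_webmon_csv($router . '/data/webmon_domains.csv', $cookie) as $row) {
    echo implode(' | ', $row), "\n";
}
?>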
Please remember to edit the script and set the host and password.
Use it at your own will...

$router = 'http://192.168.0.1'; //Location of the router web interface
$pass = 'password'; //Router login password
Thank you and have a great day.