Hi
I would like to know if there is any way HTTPS sites could show up on the web monitor page. Gmail is the reason: there is no way to block it in Gargoyle and no way to tell from the log when it was accessed.
Is there a way to block HTTPS in access restrictions by blocking remote or local ports? If so, how? The "Block" and "Block All" options in "Restricted Resources" have never worked for me; the only thing that worked was using "All Network Access" and blocking my chosen HTTP sites.
Thank you!
Https ...again
- DoesItMatter
- Moderator
- Posts: 1373
- Joined: Thu May 21, 2009 3:56 pm
Re: Https ...again
http://community.norton.com/t5/Norton-I ... /m-p/78568
http://superuser.com/questions/43571/lo ... ithout-ssl
Some info on Gmail's secure access and the ports it uses.
You could block all access to the secure ports, but be aware that you probably won't be able to do any secure online banking or shopping either, unless you specifically allow certain websites like your bank's address, amazon.com, ebay.com, etc.
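For reference, here is roughly what blocking the secure ports looks like from the router's command line. This is only a sketch: port 443 is the standard HTTPS port, but the extra mail ports and the sample IP in the exception are assumptions added for illustration, and rules entered this way do not survive a reboot unless you add them to the firewall's startup script.

    # Rough sketch (run over SSH on the router): reject outbound HTTPS for all LAN clients.
    iptables -I FORWARD -p tcp --dport 443 -j REJECT --reject-with tcp-reset
    # Optional: also catch the common secure mail ports Gmail clients can use (assumption).
    iptables -I FORWARD -p tcp -m multiport --dports 465,993,995 -j REJECT --reject-with tcp-reset
    # To still allow one secure site (the IP below is purely illustrative, e.g. your bank),
    # insert an ACCEPT afterwards; -I puts it at the top of the chain, above the REJECTs.
    iptables -I FORWARD -p tcp -d 203.0.113.10 --dport 443 -j ACCEPT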


2x Asus RT-N16 = Asus 3.0.0.4.374.43 Merlin
2x Buffalo WZR-HP-G300NH V1 A0D0 = Gargoyle 1.9.x / LEDE 17.01.x
2x Engenius - ESR900 Stock 1.4.0 / OpenWRT Trunk 49400
Re: Https ...again
Thank you for your reply.
Could you explain how to block HTTPS on just that one machine using Gargoyle, without blocking it completely for me too?
I prefer not to use NIS.
I tried blocking it using the application protocol (SSL) option in the firewall, but that did not block it.
Re: Https ...again
Instead of matching based on Application Protocol (which is a time-consuming and often inaccurate method of matching), try blocking port 443.
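A rough sketch of what that rule amounts to at the command line, in case it helps. The address 192.168.1.100 is only a placeholder for the one machine you want restricted; in the Gargoyle GUI the equivalent should be a restriction rule that applies only to that host and matches remote port 443.

    # Sketch: block HTTPS for a single LAN host only (placeholder address).
    iptables -I FORWARD -s 192.168.1.100 -p tcp --dport 443 -j REJECT --reject-with tcp-reset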
Re: Https ...again
Hello Eric,
My final question to you is this (I promise).
I realize now that HTTPS is needed for a school website, so I am unable to block HTTPS completely.
Could you (or anyone) please explain step by step how to allow just one specific HTTPS website while blocking all other HTTPS sites, preferably on just one machine?
The only HTTPS site I would like to allow is pupilpath.
Thank you for your help!
- DoesItMatter
- Moderator
- Posts: 1373
- Joined: Thu May 21, 2009 3:56 pm
Re: Https ...again
O M G ... I am never having kids - LOL
It may not be P.C., but I would be smacking my kids!
I THINK this is how you do it, heather... I don't usually set up exception rules, but I think this should work for you.
This is under Firewall -> Restrictions -> Exceptions (White List).
- Attachment: pupil-path.jpg (52.94 KiB)


Re: Https ...again
Hi There DoesItMatter!
Thank you for the easy-to-follow instructions.
It looked promising, but when I used that setting it seemed to have no effect. After a while a few HTTP websites would not load. I had forgotten about the whitelist rule, and when I went to gargoyle-router.com it was down, and I was like OOOHH NOOOOOO! But then I remembered the whitelist rule, unchecked it, saved settings, and everything loaded properly.
I am currently still running version 1.3.7, but that should not make a difference. I tried two different browsers, rebooted the router, closed and reopened the browser, etc. PS: don't have kids! When that boy-interest thing kicks in and they start challenging you and being rebellious... good grief, Charlie Batman.
- DoesItMatter
- Moderator
- Posts: 1373
- Joined: Thu May 21, 2009 3:56 pm
Re: Https ...again
I've got some vacation time coming up within the next day or two.
I'll mess around with this and see if I can figure this stuff out.
I never block anything, but I know it's an issue for quite a few people.
Maybe this way we can get some sets of rules down and add them to the wiki documentation.
It might be easier to replicate if I know what settings you currently have in your blacklist / whitelist rules.
If you don't want to post them here, you can always send them in a PM, or just photoshop out any stuff you don't want shown.


Re: Https ...again
DoesItMatter: That won't work. As I think I've stated before, you can't block an HTTPS site by its name the way you can for sites that are not encrypted. The reason is that in order to know the name of the site requested, you have to look inside the packet... which in the case of HTTPS is encrypted. It's not a limitation of my program that a little bit of clever code could fix -- it's just the reality of encrypted connections.
However, there may be a solution, though it's not perfect. Every site on the internet has an IP address which tells the browser where to look for it. If you block/allow the right IP (which you can determine without looking inside the encrypted packet), you can specify particular sites for which to allow or disallow HTTPS.
The problem is that there can be more than one site on a given IP (in which case you would block too many sites), or there might be multiple IPs for a given site (in which case you would have to block/allow all of them for this to work). Also, the IP a site uses can sometimes change. These drawbacks make blocking websites by IP somewhat problematic.
However, it looks like the website you're interested in -- pupilpath.com -- has exactly one IP, which right now is 184.106.39.84. In case you're wondering, I found this using the nslookup utility in Linux. Your best bet is to apply your HTTPS blocking rule to all remote IPs EXCEPT 184.106.39.84. That's the best you can do, I think. If the IP changes, you'll have to change the rule.
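Put together, that suggestion boils down to a DNS lookup plus two firewall rules. A rough command-line sketch, with 192.168.1.100 again standing in for the restricted machine's address (re-run the lookup and adjust the rule if pupilpath's IP ever changes):

    # Find the site's current address (any Linux machine, or the router itself):
    nslookup pupilpath.com
    # Reject HTTPS in general for that one host, then insert the exception on top.
    # (-I adds at the head of the chain, so the rule added last is checked first.)
    iptables -I FORWARD -s 192.168.1.100 -p tcp --dport 443 -j REJECT --reject-with tcp-reset
    iptables -I FORWARD -s 192.168.1.100 -d 184.106.39.84 -p tcp --dport 443 -j ACCEPT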