Latest revision as of 08:44, 31 January 2017
==Squid Proxy Server Settings==

===SARG: Squid Analysis Report Generator===
"Squid Analysis Report Generator is a tool that allows you to view "where" your users are going on the Internet."
SARG Homepage: http://sarg.sourceforge.net/
SME SARG HowTo: [[Sarg]]
This section should contain links to the various Squid modifiers - Squidguard, Dansguardian, Sarg, etc.
===DansGuardian: true web content filtering for all===
"DansGuardian is an award winning web content filtering proxy(1) for Linux, FreeBSD, OpenBSD, NetBSD, Mac OS X, HP-UX, and Solaris that uses Squid(2) to do all the fetching. It filters using multiple methods. These methods include URL and domain filtering, content phrase filtering, PICS filtering, MIME filtering, file extension filtering, POST limiting."
DansGuardian Homepage: http://dansguardian.org/
SME DansGuardian HowTo: [[Dansguardian]]
===SquidGuard===
"SquidGuard is a URL redirector used to apply blacklists with the proxy software Squid."
Squidguard Homepage: http://www.squidguard.org/
SME SquidGuard Howto: [[SquidGuard]] or [[WebFiltering]]
===ProxyPass===
Despite the name, ProxyPass has nothing to do with the Squid proxy and is actually an Apache directive designed to allow a given URL on your server to return the content of some other webserver - either internal (behind your SME) or external.
Apache proxypass Documentation: http://httpd.apache.org/docs/2.2/mod/mod_proxy.html#proxypass
SME proxypass Howto: [[SME_Server:Documentation:FAQ#Proxy_Pass]]
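As a quick illustration (not SME-specific; the path and internal address below are made up for the example), a ProxyPass pair in an Apache configuration looks like this:

```apache
# Requests for /wiki/ on this server are fetched from an internal host
# behind the SME; ProxyPassReverse rewrites Location headers in redirects
# so clients are sent back through /wiki/ rather than to the internal IP.
ProxyPass        /wiki/ http://192.168.1.10/wiki/
ProxyPassReverse /wiki/ http://192.168.1.10/wiki/
```

On SME Server, such directives are normally managed through the proxypass howto linked above rather than edited into httpd.conf by hand.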
===Customizing Squid===

====Bypass the Proxy Server Without Disabling Your Transparent Proxy====

====Caching WindowsUpdate download (and others too)====
From these posts:
* http://forums.contribs.org/index.php?topic=34812.0
* http://forums.contribs.org/index.php?topic=40391.0
Create the template fragment:

 mkdir -p /etc/e-smith/templates-custom/etc/squid/squid.conf
 nano -w /etc/e-smith/templates-custom/etc/squid/squid.conf/05refreshpattern

and paste in the lines below:
 hierarchy_stoplist cgi-bin ?
 acl QUERY urlpath_regex cgi-bin \?
 no_cache deny QUERY
 cache_mem 16 MB
 maximum_object_size 1280096 KB
 cache_dir ufs /var/spool/squid 3000 16 256
 
 # caching windows update - various windows versions
 refresh_pattern http://.*\.windowsupdate\.microsoft\.com/ 0 80% 20160 reload-into-ims
 refresh_pattern http://.*\.update\.microsoft\.com/ 0 80% 20160 reload-into-ims
 refresh_pattern http://download\.microsoft\.com/ 0 80% 20160 reload-into-ims
 refresh_pattern http://windowsupdate\.microsoft\.com/ 0 80% 20160 reload-into-ims
 refresh_pattern http://office\.microsoft\.com/ 0 80% 20160 reload-into-ims
 
 # cache ubuntu updates [check your logs; use the COUNTRY-SPECIFIC first line or the generic one below]
 refresh_pattern http://.*\.archive\.ubuntu\.com 0 80% 20160 reload-into-ims
 refresh_pattern http://archive\.ubuntu\.com 0 80% 20160 reload-into-ims
 
 # add any site you want to cache below
Execute:
 expand-template /etc/squid/squid.conf
 squid -k reconfigure
Reference:

cache_mem
:Description - The amount of RAM to be used for caching so-called in-transit objects, hot objects and negative-cached objects. This is an optimization feature; Squid can use much more memory than the value specified here. If you have 48 MB free for Squid, set 48/3 = 16 MB. The value is specified in megabytes.

maximum_object_size
:Description - Objects larger than this size will NOT be saved on disk. The value is specified in kilobytes.

cache_dir
:Description - Specifies, in this order:
:* which kind of storage system to use; ufs
:* the name of the cache directory; /var/spool/squid
:* the disk space in megabytes to use under this directory; 3000 megabytes
:* the number of first-level subdirectories to be created under the cache directory; 16 (level 1)
:* the number of second-level subdirectories to be created under each first-level directory; 256 (level 2)
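To check whether the patterns above are actually producing cache hits, one rough approach (a sketch; the native access.log format and its default path are assumptions about your setup) is to count HIT result codes in Squid's access log:

```shell
# hit_ratio: count cache hits in a Squid native-format access log.
# In that format, field 4 holds the result code, e.g. TCP_HIT/200 or
# TCP_MISS/200, so any code containing "HIT" was served from cache.
hit_ratio() {
    awk '{ total++ } $4 ~ /HIT/ { hits++ }
         END { printf "%d/%d\n", hits, total }' "$1"
}

# Typical use on the server (path is the usual default; adjust if needed):
# hit_ratio /var/log/squid/access.log
```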
====Content Encoding Error====
The problem here is that the squid shipped with SME Server 7.x is version 2.5, which lacks HTTP/1.1 support. SME 8 ships a later version and solves this issue. See [[Bugzilla 6058]]
As a workaround you will need to create a few custom-templates and use squid's acl rules.
Create a file called 21BrokenHeader in the following directory (create the directory if it doesn't exist):

 /etc/e-smith/templates-custom/etc/squid/squid.conf

Enter the following line in 21BrokenHeader:

 acl broken dstdomain www.maplin.co.uk

Save the file.

If it does not already exist, create a file called 40http_access75AllowLocal in the same directory:

 /etc/e-smith/templates-custom/etc/squid/squid.conf

Enter the following line in 40http_access75AllowLocal:

 header_access Accept-Encoding deny broken

Save and quit, then expand the template:

 expand-template /etc/squid/squid.conf

and restart the squid service:

 sv t /service/squid/
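The fragment-creation steps above can also be scripted. This is a sketch only; the fragment names and contents are exactly those given in the text, wrapped in a hypothetical helper function so the target directory can be passed in:

```shell
# make_fragments DIR: write both custom-template fragments into DIR.
make_fragments() {
    mkdir -p "$1"
    echo 'acl broken dstdomain www.maplin.co.uk'     > "$1/21BrokenHeader"
    echo 'header_access Accept-Encoding deny broken' > "$1/40http_access75AllowLocal"
}

# On the server (as root), then expand and restart as shown above:
# make_fragments /etc/e-smith/templates-custom/etc/squid/squid.conf
# expand-template /etc/squid/squid.conf
# sv t /service/squid/
```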
====How do I block access to (Facebook|Twitter|whatever) that runs on https?====
Nowadays many sites work only over HTTPS; we can't filter their content, but we can block access to them.
From this post:
* http://forums.contribs.org/index.php/topic,51474.msg261561.html#msg261561
Create the right path under /etc/e-smith/templates-custom:

 mkdir -p /etc/e-smith/templates-custom/etc/squid/squid.conf

Move into the new path:

 cd /etc/e-smith/templates-custom/etc/squid/squid.conf

Create a new fragment, 20ACL40bannedsites:

 nano 20ACL40bannedsites

Its content must be (for example, to block Facebook):

 acl bannedsites dstdomain .facebook.com

Many domains can be blocked; just put them on the same line, separated by spaces. Save and exit with Ctrl-X, Y.

Create another fragment, 40http_access15denyconnectBannedsites:

 nano 40http_access15denyconnectBannedsites

with this content:

 http_access deny CONNECT bannedsites

Save and exit with Ctrl-X, Y.

Now invoke the proxy-update event:

 signal-event proxy-update
Tested and working on SME 8.x and SME 9.
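For reference, the same two fragments can be written from a small shell function (a sketch of the steps above; the helper name is made up, and you would extend the acl line with any further domains to block):

```shell
# make_banned_fragments DIR: write the ACL fragment and the http_access
# fragment described above into DIR.
make_banned_fragments() {
    mkdir -p "$1"
    echo 'acl bannedsites dstdomain .facebook.com' > "$1/20ACL40bannedsites"
    echo 'http_access deny CONNECT bannedsites'    > "$1/40http_access15denyconnectBannedsites"
}

# On the server (as root), then apply the change:
# make_banned_fragments /etc/e-smith/templates-custom/etc/squid/squid.conf
# signal-event proxy-update
```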
====Allow squid custom file descriptor limit====
The new default limit is 4096, and a custom value can be set with:

 db configuration setprop squid MaxFileDesc 8192
 expand-template /etc/squid/squid.conf
 sv t /service/squid

----
[[Category:Howto]]
[[Category:Administration]]