How to block some robots and referer pages in Apache root level

http://serverfault.com – When I was using lighttpd I could easily achieve this with entries like the ones below, so all websites were protected. Wget robots:

    $HTTP["useragent"] =~ "Wget" {
        $HTTP["url"] =~ "^/tagi(.*)" {
        # $HTTP["url"] =~ "" {
            url.access-deny = ( "" )
        }
        $HTTP["url"] =~ "^/tags(.*)" {
            url.access-deny = ( "" )
        }
        $HTTP["url"] =~ "^/kom.php(.*)" {
            url.access-deny = ( "" )
        }
        $HTTP["querystring"] =~ "^(.*)strony(.*)" {
            url.access-deny = ( "" )
        }
        $HTTP["querystring"] =~ "^(.*)page(.*)" {
            url.access-deny = ( "" )
        }
    }