Hi, I receive a lot of log messages like the one below. I guess this is some Microsoft robot which is passed through by the firewall but cannot reach Apache properly on my SuSE 11.2 server, and it won’t give up.
The log level is “Log only critical”, and the messages disappear if I switch off logging for accepted packets.
Could somebody please help me understand why the firewall considers these packets critical? My apache2 server is otherwise working properly.
Post the output here and we’ll see the specific rule that actually does
the logging.
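For example, something like this should list the LOG rules that SuSEfirewall2 has generated (standard iptables tools assumed; the chain names vary between versions):

iptables-save | grep -- '-j LOG'
iptables -L -n -v | grep -i LOG

The rule carrying the SFW2-INext-ACC-TCP prefix is the one writing those lines, normally with a limit match attached so it only fires a few times per minute.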
Good luck.
zlisiecki wrote:
> Hi, I receive a lot of log messages like the one below. I guess this
> is some Microsoft robot which is passed through by the firewall but
> cannot reach Apache properly on my SuSE 11.2 server, and it won’t give up.
>
> Jan 28 06:45:33 xxxx kernel: [370019.686176] SFW2-INext-ACC-TCP IN=eth2
> OUT= MAC=00:0c:f1:c7:39:3c:00:50:7f:c0:43:88:08:00 SRC=65.55.216.33
> DST=192.168.1.17 LEN=48 TOS=0x00 PREC=0x00 TTL=116 ID=11944 DF PROTO=TCP
> SPT=20098 DPT=80 WINDOW=65535 RES=0x00 SYN URGP=0 OPT
> (0204057A01010402)
>
> The log level is “Log only critical”, and the messages disappear if I
> switch off logging for accepted packets.
> Could somebody please help me understand why the firewall considers
> these packets critical? My apache2 server is otherwise working
> properly.
>
> Thanks in advance for your answer
> Zbyszek
>
>
Yes, I see, it’s the limit 3/min which causes the logging, am I right? The Microsoft robot is simply too fast. The Crawl-delay parameter in robots.txt is what should control that.
But I set:
User-Agent: *
Crawl-delay: 30
Request-rate: 1/5
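As far as I can tell, the knobs for this logging live in /etc/sysconfig/SuSEfirewall2; on my box the relevant lines look roughly like this (variable names and defaults may differ slightly between versions, the comments in the file explain them):

FW_LOG_ACCEPT_CRIT="yes"
FW_LOG_ACCEPT_ALL="no"
FW_LOG_LIMIT="3/minute"

so I assume the 3/min comes from FW_LOG_LIMIT rather than from anything Apache does.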
As shown in those links posted earlier, the bots do not respect the
robots.txt file, which is why they should be blocked completely until they
grow up.
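If you do decide to block at the firewall, a plain DROP in front of the accept rules is enough; roughly like this (the 65.55.0.0/16 range is only an example taken from the SRC address in your log; check the bot’s published address ranges before using it):

iptables -I INPUT -s 65.55.0.0/16 -j DROP

Bear in mind that a rule added by hand like this disappears the next time SuSEfirewall2 is restarted.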
Good luck.
zlisiecki wrote:
> Yes, I see, it’s the limit 3/min which causes the logging, am I right?
> The Microsoft robot is simply too fast. The Crawl-delay parameter in
> robots.txt is what should control that.
> But I set:
> User-Agent: *
> Crawl-delay: 30
> Request-rate: 1/5
>
>
OK, I could deny msnbot access, but now suddenly a lot of others have appeared. This is a sample list over a 10-minute interval, with the number of logfile lines for each:
Google and msnbot are the 66.* and 65.* addresses; the others seem to come from ISP users with dynamic IPs. I’d like to deny access to all IPs which violate my robots.txt settings. But do you often see this kind of situation too? Should I interpret it as some sort of attack?
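(For reference, such per-address counts can be pulled out of the firewall log with something like this, assuming the SFW2 messages land in /var/log/firewall; adjust the path if your syslog puts them elsewhere:

grep 'SFW2-INext-ACC-TCP' /var/log/firewall | grep -o 'SRC=[0-9.]*' | sort | uniq -c | sort -rn

which prints one line per source address, prefixed with the number of matching log lines.)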
There is, I believe, a place in SuSEfirewall2 for custom rules, but you’d have to read the comments; I use a separate firewall box, so I don’t use SF2 myself.
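If memory serves, it’s the FW_CUSTOMRULES variable in /etc/sysconfig/SuSEfirewall2, which points to a script such as /etc/sysconfig/scripts/SuSEfirewall2-custom containing hook functions you fill in yourself. A rough sketch only, since I can’t check a live SF2 box; verify the function names against the comments in your copy:

fw_custom_before_denyall() {
    # example only: drop one offending range outright before the final deny rules
    # (same illustrative 65.55.0.0/16 range as above; substitute the ranges you really want to block)
    iptables -I INPUT -s 65.55.0.0/16 -j DROP
    true
}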
Be careful you don’t deny legitimate users while trying too hard to block zombies. Remember that each fetch of a page element like an icon image also counts as one connection, although HTTP/1.1 may reuse a connection.