Author Topic: Web sites failures  (Read 7044 times)


Offline saratogaWX

  • Administrator
  • Forecaster
  • *****
  • Posts: 9297
  • Saratoga, CA, USA Weather - free PHP scripts
    • Saratoga-Weather.org
Re: Web sites failures
« Reply #50 on: July 29, 2016, 11:43:27 PM »
Too much stuff all at once, I think.

I'd try the simpler (but also effective)

Code: [Select]
SetEnvIfNoCase Referer "^qq829" TOBLOCK=1
SetEnvIfNoCase Referer "^cnzz" TOBLOCK=1
SetEnvIfNoCase ^User-Agent$ .*80legs.* TOBLOCK=1
SetEnvIfNoCase ^User-Agent$ .*Ezooms.* TOBLOCK=1
SetEnvIfNoCase ^User-Agent$ .*Ahrefs.* TOBLOCK=1
SetEnvIfNoCase ^User-Agent$ .*package.* TOBLOCK=1

<FilesMatch "(.*)">
Order Allow,Deny
Allow from all
Deny from env=TOBLOCK
</FilesMatch>

That should get the really bad bots off your site.

I was curious as to why
Code: [Select]
RewriteEngine on
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^http://(www\.)?komokaweather\.com/.*$ [NC]
RewriteRule \.(gif|jpg|png|js|css)$ - [F]
was in there, and what its purpose was?

Ken True/Saratoga, CA, USA main site: saratoga-weather.org
Davis VP1+ FARS, Blitzortung RED, GRLevel3, WD, WL, VWS, Cumulus, Meteobridge
Free weather PHP scripts/website templates - update notifications on Twitter saratogaWXPHP

Offline PaulMy

  • Forecaster
  • *****
  • Posts: 5519
    • KomokaWeather
Re: Web sites failures
« Reply #51 on: July 29, 2016, 11:44:35 PM »
Hi John,
I uploaded it with that part removed, but that gave an Internal Server Error, so I have removed the .htaccess from komokaweather.ca entirely, and now the page is basically back to standard!

I still have your original .htaccess in the komokaweather.com root folder.

Regards,
Paul

Offline PaulMy

  • Forecaster
  • *****
  • Posts: 5519
    • KomokaWeather
Re: Web sites failures
« Reply #52 on: July 29, 2016, 11:49:02 PM »
=D> \:D/ I've uploaded Ken's code as .htaccess, and www.komokaweather.ca is back!

Many thanks again,
Paul

Offline PaulMy

  • Forecaster
  • *****
  • Posts: 5519
    • KomokaWeather
Re: Web sites failures
« Reply #53 on: July 30, 2016, 12:01:55 AM »
Just to update:
Saratoga template working as expected
Leuven template working as expected
My Cumulus www.komokaweather.com site is working. I may begin to add some of the removed items back in, one at a time, tomorrow.
I will try the cron jobs for pws tomorrow, as that is all that appears to be missing there.
Then on to the Meteotemplate j-template: the cron job, and looking at the failed blocks...

I haven't determined what caused the issues over the past days; adding the removed items back into the .com site may lead to an answer, or perhaps GoDaddy has taken me out of the penalty box :roll:

Thanks for your assistance, and I'm looking forward to a good night's sleep.

Paul

Offline PaulMy

  • Forecaster
  • *****
  • Posts: 5519
    • KomokaWeather
Re: Web sites failures
« Reply #54 on: July 30, 2016, 12:22:32 AM »
OK, final post for tonight.

Just checked my GoDaddy error log and see 11 "client denied by server configuration: /var/chroot/home/content/96/5379896/html/komokaweather-ca/wxwuhistory.php" entries in less than 1 minute, all from the French IP 164.132.161.xx, like:


164.132.161.54 - Geo Information
IP Address: 164.132.161.54
Host: hydrogen152.a.ahrefs.com
Location: FR, France
City: -
Organization: Wind Telecomunicazioni
ISP: Wind Telecomunicazioni



Offline saratogaWX

  • Administrator
  • Forecaster
  • *****
  • Posts: 9297
  • Saratoga, CA, USA Weather - free PHP scripts
    • Saratoga-Weather.org
Re: Web sites failures
« Reply #55 on: July 30, 2016, 12:39:25 AM »
Yep.. the bad-bot Ahrefs is doing an infinite walk through the wxwuhistory.php links, ever further back in time.  Each access to a new date causes a fetch from weatherunderground.com for the data (causing some congestion), and so it repeats.  Blocking the misbehaving bot in .htaccess is the only way to stop the abuse of your site.

Glad it's working out now.  It's not something you did; it's likely that the excess traffic to that page from the bad bot was causing the congestion for all your sites as you reached your account limit on concurrent PHP sessions (and maybe a limit on the number of outbound connections too).  Things should settle down now.   Keep an eye on your logs for other multi-accesses to the wxwuhistory.php page from other non-throttled web crawlers.

Offline Maumelle Weather

  • Forecaster
  • *****
  • Posts: 1827
    • Maumelle Weather
Re: Web sites failures
« Reply #56 on: July 30, 2016, 08:02:55 AM »

I was curious as to why
Code: [Select]
RewriteEngine on
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^http://(www\.)?komokaweather\.com/.*$ [NC]
RewriteRule \.(gif|jpg|png|js|css)$ - [F]
was in there, and what its purpose was?

Hi Ken,

I had added that a long time ago (courtesy of Kevin - TNET) when I was uploading GR3 images and found through my logs that several folks were hot linking the radar images and some of the .js scripts.

In January of this year, I had the same issues with bots/miscreants that Paul is having when I started receiving emails about maximum CPU and MySQL query usage. That's where the extensive robots.txt and .htaccess came from. Also, I now use ZB Block, in addition to the other files and have effectively stopped the bots completely. The basic program hasn't been updated in a couple of years, however, some members of the ZB Block forum have taken up the task of keeping the signature files updated, including country wide IP blocks that are updated at least monthly, if not weekly.

I'm attaching what the killed_log looks like. It really is an awesome script for what it does.

John
GR2AE, GR3, Cumulus

Offline mcrossley

  • Forecaster
  • *****
  • Posts: 1140
    • Wilmslow Astro
Re: Web sites failures
« Reply #57 on: July 30, 2016, 09:43:52 AM »
+ for zbblock from me.
Mark

Offline saratogaWX

  • Administrator
  • Forecaster
  • *****
  • Posts: 9297
  • Saratoga, CA, USA Weather - free PHP scripts
    • Saratoga-Weather.org
Re: Web sites failures
« Reply #58 on: July 30, 2016, 11:04:47 AM »
Ahh, yes, I remember Kevin's anti-hotlink mod for images .. thanks for reminding me.

In Paul's case, due to the cross-referencing between www.komokaweather.ca and www.komokaweather.com, it was disallowing the .css/.gif/.png requests with 403s; now that it's removed, all the parts are flowing on komokaweather.ca.

Re ZBBlock, I've looked at it and was impressed with the extensive rule set.  I've opted to keep-it-simple (for me) and do daily log analysis (a bunch of perl scripts), and make .htaccess mods as needed to block if required.  Since my background includes internet security, I do like to see directly (in the logs) what the miscreants are slinging.  Since I only have basic PHP with sanitized user inputs and no mySQL queries via the site, I'm not very worried about getting compromised via HTTP traffic, but I am vigilant.

Offline Jáchym

  • Meteotemplate Developer
  • Forecaster
  • *****
  • Posts: 8605
    • Meteotemplate
Re: Web sites failures
« Reply #59 on: July 30, 2016, 11:11:00 AM »
Protecting Meteotemplate against SQL injections was pretty simple too.

Inserting anything to any table can only be done by either the update script (requires update password) or by the admin (requires login via admin password).

There are pages where user input is included in the query, but due to the nature of the pages, it was very easy to sanitize this. The only time a user input is taken into account is when choosing for example which year, month etc. they want to see.

So for those cases I can simply check the input and make sure it is a number within a certain range, and if some hacker tried to send an injection it would simply be rejected.

E.g.: in some reports, the user chooses a year and gets results from the database.
In that case the year is sent as a GET parameter, so all I had to do was make a simple check that the user input is a number between, for example, 1990 and 2100; anything different from that results in "die()".

Likewise, for months I can simply check for a number from 1 to 12, etc.
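The whitelist check described above can be sketched like this (an illustrative Python sketch, since Meteotemplate itself is PHP; the function name and behavior here are hypothetical, standing in for the actual check-then-die() logic):

```python
def valid_int_param(value, low, high):
    """Return the parameter as an int if it is a whole number in
    [low, high]; otherwise None (the PHP version would die())."""
    if not str(value).isdigit():   # rejects anything non-numeric
        return None
    n = int(value)
    return n if low <= n <= high else None

# A 'year' GET parameter must fall in 1990..2100, a 'month' in 1..12
print(valid_int_param("2016", 1990, 2100))                 # accepted -> 2016
print(valid_int_param("1; DROP TABLE data", 1990, 2100))   # rejected -> None
print(valid_int_param("13", 1, 12))                        # rejected -> None
```

Because the value is reduced to a bounded integer before it ever reaches the query, an injection string simply never survives the check.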

Offline saratogaWX

  • Administrator
  • Forecaster
  • *****
  • Posts: 9297
  • Saratoga, CA, USA Weather - free PHP scripts
    • Saratoga-Weather.org
Re: Web sites failures
« Reply #60 on: July 30, 2016, 11:50:56 AM »
That's great, Jachym!   8-)  =D>

That is the proper way to protect your system from the miscreants who will try to jam anything into a URL to see if it can get them in.

Thanks for your thoughtfulness and preparations to keep your users safe.

Best regards,
Ken

Offline PaulMy

  • Forecaster
  • *****
  • Posts: 5519
    • KomokaWeather
Re: Web sites failures
« Reply #61 on: July 30, 2016, 02:11:22 PM »
Hi Wim,
Quote
What we need to do is remove that page from menu wsMenuData.xml
Also we need to update the WU pages to use https.

I have re-added that page back into wsMenuData.xml.  I have also updated to the 2016-06-05 WU scripts, and applied all the other updates, I think.  Is there a quick way to check that I have the most current versions of files/scripts?

Thanks for your help; the systematic approach you suggested is what I needed in order to better understand all the background webhost/website processes, etc.

Regards,
Paul


Offline PaulMy

  • Forecaster
  • *****
  • Posts: 5519
    • KomokaWeather
Re: Web sites failures
« Reply #62 on: July 30, 2016, 11:42:49 PM »
Should I be concerned about periodic visit attempts and these errors?
Quote
[Sat Jul 30 19:18:04 2016] [5379896] [fcgid:warn] (32)Broken pipe: [client 134.249.65.218:63816] mod_fcgid: ap_pass_brigade failed in handle_request_ipc function, referer http://postovoi.com/
[Sat Jul 30 19:18:24 2016] [5379896] [fcgid:warn] (110)Connection timed out: [client 134.249.65.218:51034] mod_fcgid: ap_pass_brigade failed in handle_request_ipc function, referer http://postovoi.com/

[Sat Jul 30 04:28:42 2016] [5379896] [fcgid:warn] (110)Connection timed out: [client 46.118.156.206:60727] mod_fcgid: ap_pass_brigade failed in handle_request_ipc function, referer http://pron.pro/
[Sat Jul 30 04:28:42 2016] [5379896] [fcgid:warn] (110)Connection timed out: [client 46.118.156.206:52946] mod_fcgid: ap_pass_brigade failed in handle_request_ipc function, referer http://lipetsk.xrus.org/
[Sat Jul 30 04:28:43 2016] [5379896] [fcgid:warn] (110)Connection timed out: [client 46.118.156.206:58161] mod_fcgid: ap_pass_brigade failed in handle_request_ipc function, referer http://lipetsk.xrus.org/

[Sat Jul 30 18:06:55 2016] [5379896] [fcgid:warn] (32)Broken pipe: [client 134.249.53.10:49602] mod_fcgid: ap_pass_brigade failed in handle_request_ipc function, referer http://zajm-pod-zalog-nedvizhimosti.ru/
[Sat Jul 30 18:06:55 2016] [5379896] [fcgid:warn] (32)Broken pipe: [client 134.249.53.10:49469] mod_fcgid: ap_pass_brigade failed in handle_request_ipc function, referer http://confib.ifmo.ru/
[Sat Jul 30 18:07:19 2016] [5379896] [fcgid:warn] (110)Connection timed out: [client 134.249.53.10:64303] mod_fcgid: ap_pass_brigade failed in handle_request_ipc function, referer http://zajm-pod-zalog-nedvizhimosti.ru/
[Sat Jul 30 18:07:19 2016] [5379896] [fcgid:warn] (110)Connection timed out: [client 134.249.53.10:59670] mod_fcgid: ap_pass_brigade failed in handle_request_ipc function, referer http://confib.ifmo.ru/


They all seem to be from the same place in Ukraine.

Paul

Offline Maumelle Weather

  • Forecaster
  • *****
  • Posts: 1827
    • Maumelle Weather
Re: Web sites failures
« Reply #63 on: July 31, 2016, 09:03:27 AM »
Hi Paul,

I have those blocked, in addition to a bunch of others, too. I'm attaching a new .htaccess.txt file showing how I accomplished that. When looking through your logs, if you come upon an IP or IPs you don't recognize, or that show suspicious behavior, this is what I do.

Go to whois.com, and copy the IP into the search box at the top, next to the green search button. Every IP on the planet is listed there, along with who it's owned by, etc.

As an example, I'll use the following IP:  134.249.65.218

This is from Ukraine, as you have already mentioned. Looking at the page for this IP, you will see this:

inetnum:        134.249.64.0 - 134.249.127.255     This is the block of IPs assigned to this carrier.

If you look farther down the page, you will see the following:

route:          134.249.0.0/16      This represents the CIDR (Classless Inter-Domain Routing) block for this carrier. I've put in the link to Wikipedia for an explanation.

Note:  Not all listings will have the "route:" entry, but most do.  If it did not have the route listed, I would normally take the first two octets (e.g. 134.249), append .0.0/16 to them, and add that to the deny-from list.
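Before adding a CIDR block to your deny list, it's worth double-checking which addresses it actually covers. A quick sketch using Python's standard ipaddress module (any similar tool works; the addresses are the ones from the whois example above):

```python
import ipaddress

# The /16 route taken from the whois listing above
block = ipaddress.ip_network("134.249.0.0/16")

# The offending client IP falls inside the block...
print(ipaddress.ip_address("134.249.65.218") in block)   # True
# ...while an address in a neighboring range does not
print(ipaddress.ip_address("134.250.0.1") in block)      # False
# A /16 covers 65,536 addresses, so be sure that's what you intend
print(block.num_addresses)                                # 65536
```

This helps avoid accidentally denying a much wider (or narrower) range than the miscreant's actual allocation.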

To block the IP(s), please look at the bottom of the .htaccess.txt file and you will see I added the following line:

<Limit GET POST>
order allow,deny
allow from all
deny from env=bad_bot

deny from 46.118.0.0/16 134.249.0.0/16
</Limit>

You can add as many different ones as you desire as you run across them, but some words of caution.  Make sure you leave a space between the addresses AND that you don't have any extra spaces, periods, etc. in there; otherwise, when you upload the new file and check your site(s), you will get a 500 server error. What I do is rename the current .htaccess to .htaccesscur so I don't overwrite the one that works, and then try the new one.

Another word of caution. If your .htaccess file gets very large, it will slow down the loading time of your site.

The additional addresses you see are the ones I see daily in my logs trying to get in.  The ones beginning with 23.96., 50., 52., 54., belong to Amazon Web Services. The 23.20. belongs to Microsoft, interestingly enough. The 202.46. belongs to China, and so on.

Hope this helps and doesn't confuse instead,

John
GR2AE, GR3, Cumulus

Offline weatherc

  • Senior Contributor
  • ****
  • Posts: 278
Re: Web sites failures
« Reply #64 on: July 31, 2016, 09:25:20 AM »
Here is a quite impressive .htaccess bad-bot blocker for Apache:
https://github.com/mitchellkrogza/apache-ultimate-bad-bot-blocker

 :grin: