WXforum.net
Web Weather => Weather Website PHP/AJAX scripting => Topic started by: tshattuck on September 13, 2019, 06:20:51 PM
-
Hello:
I noticed today my webpage wxmetar.php will not load and it locks both my browsers (Firefox and Internet Explorer). The web page is part of the php scripts from Ken True.
I have not made any recent changes to this page. Please see attached file.
Don't know why I am having this issue. I did not do the May 2019 NWS Alerts update, as I am not using Leaflet/OpenStreetMaps, but I don't know if that may be impacting the wxmetar.php page. Does anyone know what the issue here is?
Any assistance is appreciated.
-
Hey,
Interesting, I've got the same issue. My site has also been very slow, even returning 503 errors, for the last 3 days, and I've done nothing to prompt this. My site is on a GoDaddy server; apparently GoDaddy has changed all the URLs to another format, something about parent and child locations, according to the help forum page on their site.
Anybody else with this issue, or with GoDaddy on a go-slow to non-existent site load (503 error)?
Thanks.
Nick. dw7240.com
-
Nick:
My site is also on GoDaddy and I am getting the 503 error also. I checked the metarcache files and they haven't updated in two days.
My file permissions are 644 in the cache folder and the cache folder is 755.
I am glad it's not something I did. Hopefully someone from the forum can tell us how to correct.
Regards,
-
Folks
Check this link:
http://www.gigharborweather.com/wxmetar.php
He is getting the error: unable to load KRNT data RC=302 Object Moved
GoDaddy changed something that impacts the PHP files. I believe I have the files in the proper directory. I checked other files in my cache directory and they are updating; the metarcache files are the only ones not updating.
Thanks
-
http://www.gigharborweather.com/check-fetch-times.php?show=versions
Using a lot of old code. In particular, you need 1.17 of get-metar-conditions-inc.php in order to do https fetches.
You are getting the "object moved" redirect because the old code fetches the http version and won't follow the redirect.
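For anyone curious what "won't follow the redirect" means in curl terms, here is a minimal illustrative fetch; this is not the actual template code, just a sketch of the fix. The URL is the real NOAA METAR path from this thread (KRNT is the station mentioned above). Without CURLOPT_FOLLOWLOCATION, curl returns the 302 "Object Moved" response itself; with it set, curl chases the redirect to the https version.

```php
<?php
// Illustrative only, not the actual template code.
$url = 'http://tgftp.nws.noaa.gov/data/observations/metar/stations/KRNT.TXT';

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of echoing it
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow the 302 to the https version
curl_setopt($ch, CURLOPT_MAXREDIRS, 3);         // guard against redirect loops
curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // don't hang the page load

$data = curl_exec($ch);
if (curl_error($ch) !== '') {
    echo "<!-- curl Error: " . curl_error($ch) . " -->\n";
}
curl_close($ch);
```

The cleaner fix is still what Jasiu says: update to version 1.17, which fetches the https URL directly.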
-
Jasiu:
Thanks for the feedback.
I ran the check-fetch-times.php file again and verified I am running version 1.17 of get-metar-conditions-inc.php.
I noticed the problem so far appears with folks who use GoDaddy as their website host.
Hopefully someone from the forum can shed some light on this issue.
Regards,
-
What's the URL of your site?
-
I am with GoDaddy and my http://www.komokaweather.ca/wxmetar.php is not working
Enjoy,
Paul
-
I'm on GD also. Mine loaded in a few seconds.
https://www.bismarckweather.net/wxmetar.php
-
Tom - never mind - it's right there at the bottom of your post. Doh.
-
I was able to get
view-source:http://www.komokaweather.com/komokaweather-ca/wxmetar.php?debug=y
to load after a long wait. I'm also trying
view-source:http://yourlocalweather-clay-ny.com/wxmetar.php?debug=y
and despite having lunch since I started it, it's still not done.
Anyway, in the former case, the issue is resolving the NOAA host:
<!-- mtr_conditions using METAR ICAO='CYXU' -->
<!-- curl fetching 'https://tgftp.nws.noaa.gov/data/observations/metar/stations/CYXU.TXT' -->
<!-- curl Error: Could not resolve host: tgftp.nws.noaa.gov -->
<!-- HTTP stats: RC=0 dest= port=0 (from sce=)
Times: dns=0.000 conn=0.000 pxfer=0.000 get=0.003 total=0.003 secs -->
<!-- headers returned:
-->
<!-- mtr_conditions returns RC='' for ICAO/METAR='CYXU' -->
Not sure why it is so painfully slow as the CURL stats are showing a quick (non-) response.
Anyway, you need to talk to your provider (GD?) about why the DNS resolution is failing.
-
Just spent 30 minutes on the phone with GD tech support and they offered no help whatsoever, although unfortunately I do not have the skill level to go toe to toe with the comedian I was talking to. I referenced Jasiu's suggestion that the DNS resolution is failing. I was able to get him to do a view-source and showed him the "<!-- curl Error: Could not resolve host: tgftp.nws.noaa.gov -->". He insisted that the problem is with our "custom" scripts and not with them. I am up to date on my scripts: https://www.hsnpar.com/check-fetch-times.php?show=versions
I see that gigharborweather & komokaweather are still having issues, as I am sure others are. Anyone having any luck?
-
As a check to make sure this is the problem and to maybe help convince tech support people, you could just plug the IP address into the code. This is NOT a permanent solution, however.
To do that, look for this code in get-metar-conditions-inc.php:
if (isset($icao) and strlen($icao) == 4) {
    $Debug.= "<!-- mtr_conditions using METAR ICAO='$icao' -->\n";
    $host = 'tgftp.nws.noaa.gov';
    $path = '/data/observations/metar/stations/';
    $metarURL = 'https://' . $host . $path . $icao . '.TXT';
    $html = '';
    $raw = '';
Change the assignment of $host to:
// $host = 'tgftp.nws.noaa.gov';
$host = '140.90.101.79';
-
Jasiu... it didn't work; however, the page loaded quickly. It was taking 60-90 seconds before.
-
Crud...
<!-- curl fetching 'https://140.90.101.79/data/observations/metar/stations/KHOT.TXT' -->
<!-- curl Error: SSL: no alternative certificate subject name matches target host name '140.90.101.79' -->
<!-- HTTP stats: RC=0 dest=140.90.101.79 port=443 (from sce=50.62.176.52)
Times: dns=0.000 conn=0.056 pxfer=0.000 get=0.196 total=0.196 secs -->
Dang security is too secure.
---
It doesn't look like GoDaddy publishes the names/addresses of its DNS servers publicly but you should be able to find them. Here's the help page:
https://ca.godaddy.com/help/find-my-godaddy-nameservers-12318
I assume you want to go to the cPanel shared hosting link on that page and then follow the instructions to get to the NS (nameservers) section.
The reason I'm asking is if I know their servers' IP addresses, then I can do a query directly to their servers and see what they are returning.
-
Ok, the "warden" has dinner on the table.... I'll check it in a few. btw, thanks for giving us a hand with this.
-
Dinner is important!! [tup]
-
23.229.178.136
-
OK, I think I was barking up the wrong tree here, because that is the authoritative DNS nameserver for your domain, not where the server is getting its search results. I'm two time zones ahead of you and getting tired, and DNS always gives me a headache. I can pick this up in the morning if someone else (who knows more) doesn't figure it out.
-
Absolutely.... Thanks
-
For those of us using Ken's "standard" template for our home page, this problem is not just the METAR Reports page not displaying correctly; it delays the home page from loading in a timely manner (actually a very long time, if at all). I assume that is because it uses our "local" METAR info for the text description of the current conditions displayed just above the temperature, currently showing on my site as: unable to load KHOT data RC= where it would normally display "Cloudy" or whatever the METAR report is at the time. However, after making the change to the IP address (for testing, see Jasiu's post above), my site at least loads at its normal speed, a second or two.
I don't know if it is just those of us using GoDaddy for hosting, but so far those of us posting about this problem do have that in common, and it hit us all at the same time. I have noticed several sites having this problem, although I have also noticed a few loading normally. Odd.
-
I'm using the Saratoga templates and GD. Still working fine here.
-
I guess that removes GoDaddy as the potential culprit. Anyone have any suggestions for those of us that still have broken sites??
-
I guess that removes GoDaddy as the potential culprit. Anyone have any suggestions for those of us that still have broken sites??
Don't think I would rule GoDaddy out yet. Wonder if the sites that aren't working are all on the same GoDaddy server?
-
I have spoken to them twice and they insist it is not their servers.
-
I'm sure they do. I had a similar response from their support about the Hannis radar.
https://www.bismarckweather.net/wxusradars-hanis3.php
The console GD gives you shows that it can resolve the DNS server names, and it does so on that screen quickly. But our script experts here are surmising that that console isn't running on the actual virtual server your website resides on. We were betting that would show much slower. But of course GoDaddy support denies it, and we have no way to prove it.
So I just live with a slow loading radar page. It is better some days vs. others. But it almost never pops up in seconds.
-
Offline we ran a test that shows that rlee's server isn't succeeding in getting the DNS resolution for the METAR server. I've cleaned up the code a little bit and anyone can try it on their own. Just take the code slice below and stick it in a file called "testdns.php". Upload to your root directory (where index.php is). Then from a browser, just add "/testdns.php" to your site name and it will execute.
Here is the code:
<?php
echo "<p>\n";
echo "Running on " . $_SERVER["SERVER_NAME"] . " (" . $_SERVER["SERVER_ADDR"] . ")\n";
echo "</p>\n";
if (isset($_REQUEST['host']))
    $host = $_REQUEST['host'];
else
    $host = "tgftp.nws.noaa.gov";
echo "<p>dns_get_record(\"$host\", DNS_A) returns:</p>\n";
$result = dns_get_record($host, DNS_A);
echo "<pre>\n";
var_dump($result);
echo "</pre>\n";
$hostaddr = gethostbyname($host);
echo "<p>gethostbyname(\"$host\") returns: " . $hostaddr . "</p>\n";
?>
If I run this on my site (https://lexmaweather.info/testdns.php), I get the following output:
Running on lexmaweather.info (74.208.57.239)
dns_get_record("tgftp.nws.noaa.gov", DNS_A) returns:
array(1) {
[0]=>
array(5) {
["host"]=>
string(22) "tgftp.cp.ncep.noaa.gov"
["class"]=>
string(2) "IN"
["ttl"]=>
int(214)
["type"]=>
string(1) "A"
["ip"]=>
string(13) "140.90.101.79"
}
}
gethostbyname("tgftp.nws.noaa.gov") returns: 140.90.101.79
If the DNS fails, dns_get_record() will return false and gethostbyname() just returns the string that is passed in.
Finally, note that you can test DNS resolution of other sites with the parameter "host" added to the url. E.g.
https://lexmaweather.info/testdns.php?host=www.wxforum.com
returns:
Running on lexmaweather.info (74.208.57.239)
dns_get_record("www.wxforum.com", DNS_A) returns:
array(1) {
[0]=>
array(5) {
["host"]=>
string(59) "HDRedirect-LB5-1afb6e2973825a56.elb.us-east-1.amazonaws.com"
["class"]=>
string(2) "IN"
["ttl"]=>
int(60)
["type"]=>
string(1) "A"
["ip"]=>
string(12) "23.20.239.12"
}
}
gethostbyname("www.wxforum.com") returns: 23.20.239.12
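Based on the failure behavior described above (dns_get_record() returns false, and gethostbyname() hands back the string you passed in), a script could detect a dead resolver before attempting a fetch. A small sketch; the helper name is mine, not part of the templates:

```php
<?php
// Hypothetical helper (the name is mine, not from the templates): returns true
// only if the host actually resolved to an IP address.
function dns_resolves($host) {
    // gethostbyname() returns the unmodified hostname string on failure,
    // so getting the input back means resolution failed.
    $addr = gethostbyname($host);
    return $addr !== $host;
}

// ".invalid" is reserved (RFC 2606) and is guaranteed never to resolve.
var_dump(dns_resolves('host.invalid'));  // bool(false)
```

A site could use a check like this to skip the METAR fetch (and serve stale cache) instead of stalling the whole page for the curl timeout.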
-
I've never dealt with GoDaddy but with 1&1 I've usually been able to get issues handed off to a higher level when the front line responders can't figure it out. I also always do it via email so I can include reams of info when I have it.
I think it's going to take that sort of effort on GD's part to figure out what is happening. Unless there is something obvious and simple we are all overlooking (would not be the first time for me...).
-
OK, I believe we have a workaround for this until / unless someone can figure out why the DNS resolution is failing.
Only make this change if you see the problem described in this thread.
Edit get-metar-conditions-inc.php. Look for the following code:
curl_setopt($ch, CURLOPT_TIMEOUT, $numberOfSeconds); // data timeout
$data = curl_exec($ch); // execute session
if (curl_error($ch) <> '') { // IF there is an error
$Debug.= "<!-- curl Error: " . curl_error($ch) . " -->\n"; // display error notice
}
Add one line to the top of this so it looks like this:
curl_setopt($ch, CURLOPT_RESOLVE, array("tgftp.nws.noaa.gov:443:140.90.101.79"));
curl_setopt($ch, CURLOPT_TIMEOUT, $numberOfSeconds); // data timeout
$data = curl_exec($ch); // execute session
if (curl_error($ch) <> '') { // IF there is an error
$Debug.= "<!-- curl Error: " . curl_error($ch) . " -->\n"; // display error notice
}
Upload modified file to your server.
This is a hack/kludge that forces the DNS resolution to happen locally.
This particular issue makes almost all pages on a site slow because the software plugin *-defs.php file (e.g., MB-defs.php), which is included on most pages, does a METAR call for the local conditions data.
-
I am experiencing the same issues described and I am also using Go Daddy for hosting. I'll try some of the tests and workarounds posted. I appreciate everyone's efforts in finding out what is causing the issue. It's strange how some are not affected.
-
Eric... Jasiu's work around worked for me. If you look at the time stamps of his post he spent quite a bit of time on this, my hat is off to him!
-
Could the issue be Cumulus related? The stations that are reporting issues all seem to be running Cumulus; the stations reporting no issues are running Weather Display. I am going to try Jasiu's workarounds. Thanks for your efforts, Jasiu!
-
I implemented the workaround as described by Jasiu and my METAR information is now loading. My National and State Extremes in the menubar are still not working. Same DNS issue maybe?
https://www.camarilloweather.com/index.php
-
Thanks to Jasiu for the bypass fix shown above. The root cause of the issue for GoDaddy folks is a DNS lookup failure by the GoDaddy caching DNS servers for tgftp.nws.noaa.gov. You have to ask their tech support to
1) login to the Webserver using SSH
2) try using nslookup tgftp.nws.noaa.gov
and see/fix the error messages shown.
They can use the contents of /etc/resolv.conf to figure out which nameserver(s) are having the issues. It is THEIR problem to fix, and Jasiu’s bypass is a good temporary workaround.
I am currently on a cruise ship between Bergen, Norway and Torshavn, Faroe Islands on our way to Montreal, Canada on the 30th. So.. responses to issues are going to be quite limited by me as I’m away from my normal debugging tools (writing this on an iPad using ship’s internet).
BTW.. in Bergen, Norway we’ve experienced the remnants of Dorian causing heavy rain/wind and high seas. What goes up the east coast of the USA ends up in Europe, thanks to following the Gulf Stream.
-
I implemented the workaround as described by Jasiu and my METAR information is now loading. My National and State Extremes in the menubar are still not working. Same DNS issue maybe?
https://www.camarilloweather.com/index.php
Good chance of that since it loads from www.cpc.ncep.noaa.gov (same server farm).
I should have time to look at this later if no one else gets a chance. I believe the solution Ken details above should fix both but if someone contacts GoDaddy, mention both domain names.
-
I have made Jasiu's mod in get-metar-conditions-inc.php and now the page loads, but it is still painfully slow.
Thanks for your diligence.
Enjoy,
Paul
-
Hey Eric,
Yeah, same thing. Similar fix: go into usaextremes.php, find the function curl_get_contents(), and add this line just before the curl_exec() call:
curl_setopt($ch, CURLOPT_RESOLVE, array("www.cpc.ncep.noaa.gov:443:140.90.101.79"));
Note that this is the same IP address as the other case. Both resolve to the same server.
-
Paul,
It looks like the issue in your case is fetching a bunch of images when the caches are invalid. If the caches are good and there are no refetches, it loads pretty quick for me.
Here is the view-source of the files in question.
<!-- Loading 14-Color https://weather.gc.ca/data/radar/temp_image/COMPOSITE_NAT/COMPOSITE_NAT_PRECIP_RAIN_2019_09_16_14_50.GIF
to radar-NAT-0.png
dir=/home/psoykkrhjuz3/public_html/komokaweather-ca/radar/ -->
<!-- fetch https://weather.gc.ca/data/radar/temp_image/COMPOSITE_NAT/COMPOSITE_NAT_PRECIP_RAIN_2019_09_16_14_50.GIF in 0.465 secs -->
<!-- reloaded radar-NAT-0.png in 0.974 secs. (Mon, 16 Sep 2019 14:50:00 UTC) -->
<!-- Loading 14-Color https://weather.gc.ca/data/radar/temp_image/COMPOSITE_NAT/COMPOSITE_NAT_PRECIP_RAIN_2019_09_16_14_40.GIF
to radar-NAT-1.png
dir=/home/psoykkrhjuz3/public_html/komokaweather-ca/radar/ -->
<!-- fetch https://weather.gc.ca/data/radar/temp_image/COMPOSITE_NAT/COMPOSITE_NAT_PRECIP_RAIN_2019_09_16_14_40.GIF in 0.477 secs -->
<!-- reloaded radar-NAT-1.png in 0.999 secs. (Mon, 16 Sep 2019 14:40:00 UTC) -->
<!-- Loading 14-Color https://weather.gc.ca/data/radar/temp_image/COMPOSITE_NAT/COMPOSITE_NAT_PRECIP_RAIN_2019_09_16_14_30.GIF
to radar-NAT-2.png
dir=/home/psoykkrhjuz3/public_html/komokaweather-ca/radar/ -->
<!-- fetch https://weather.gc.ca/data/radar/temp_image/COMPOSITE_NAT/COMPOSITE_NAT_PRECIP_RAIN_2019_09_16_14_30.GIF in 0.508 secs -->
<!-- reloaded radar-NAT-2.png in 0.952 secs. (Mon, 16 Sep 2019 14:30:00 UTC) -->
<!-- Loading 14-Color https://weather.gc.ca/data/radar/temp_image/COMPOSITE_NAT/COMPOSITE_NAT_PRECIP_RAIN_2019_09_16_14_20.GIF
to radar-NAT-3.png
dir=/home/psoykkrhjuz3/public_html/komokaweather-ca/radar/ -->
<!-- fetch https://weather.gc.ca/data/radar/temp_image/COMPOSITE_NAT/COMPOSITE_NAT_PRECIP_RAIN_2019_09_16_14_20.GIF in 0.508 secs -->
<!-- reloaded radar-NAT-3.png in 1.028 secs. (Mon, 16 Sep 2019 14:20:00 UTC) -->
<!-- Loading 14-Color https://weather.gc.ca/data/radar/temp_image/COMPOSITE_NAT/COMPOSITE_NAT_PRECIP_RAIN_2019_09_16_14_10.GIF
to radar-NAT-4.png
dir=/home/psoykkrhjuz3/public_html/komokaweather-ca/radar/ -->
<!-- fetch https://weather.gc.ca/data/radar/temp_image/COMPOSITE_NAT/COMPOSITE_NAT_PRECIP_RAIN_2019_09_16_14_10.GIF in 0.557 secs -->
<!-- reloaded radar-NAT-4.png in 1.086 secs. (Mon, 16 Sep 2019 14:10:00 UTC) -->
<!-- Loading 14-Color https://weather.gc.ca/data/radar/temp_image/COMPOSITE_NAT/COMPOSITE_NAT_PRECIP_RAIN_2019_09_16_14_00.GIF
to radar-NAT-5.png
dir=/home/psoykkrhjuz3/public_html/komokaweather-ca/radar/ -->
<!-- fetch https://weather.gc.ca/data/radar/temp_image/COMPOSITE_NAT/COMPOSITE_NAT_PRECIP_RAIN_2019_09_16_14_00.GIF in 0.563 secs -->
<!-- reloaded radar-NAT-5.png in 1.095 secs. (Mon, 16 Sep 2019 14:00:00 UTC) -->
<!-- Loading 14-Color https://weather.gc.ca/data/radar/temp_image/COMPOSITE_NAT/COMPOSITE_NAT_PRECIP_RAIN_2019_09_16_13_50.GIF
to radar-NAT-6.png
dir=/home/psoykkrhjuz3/public_html/komokaweather-ca/radar/ -->
<!-- fetch https://weather.gc.ca/data/radar/temp_image/COMPOSITE_NAT/COMPOSITE_NAT_PRECIP_RAIN_2019_09_16_13_50.GIF in 0.514 secs -->
<!-- reloaded radar-NAT-6.png in 1.060 secs. (Mon, 16 Sep 2019 13:50:00 UTC) -->
<!-- small image w=300 h=261 saved to radar-NAT-0-sm.png in 0.046 secs. -->
<!-- image files cached in 7.000 secs. -->
I've seen this take up to 12 seconds. You might want to think about a cron job to fetch the files so that the caches are always fresh.
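A minimal sketch of such a cron job, assuming your host offers shell access or a control-panel cron feature; the URL and schedule below are placeholders only, so point it at your own radar page:

```shell
# Example crontab entry (URL and schedule are placeholders, adjust for your site):
# fetch the radar page every 10 minutes so the image caches stay fresh and a
# visitor never triggers the slow refetch themselves.
*/10 * * * * curl -s -o /dev/null "http://example.com/wxradar.php"
```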
-
I’m ya having the same issue load speeds slow and meter not working for BMG
-
1) Anyone gotten anywhere with GoDaddy?
2) Has everyone affected put in the temporary hack to get around the problem?
-
Hi,
I've had little to no response from GoDaddy, except that they're quick to blame the scripts we use, yet their own help forum describes changes they have made (read my answer above, the second post). Clearly this is not something we have done, pretty sure about that. I've applied the fixes and still get very slow loads and, yes, no METAR reports, so I guess it's a bit like flogging a dead horse. I think as time goes on this problem will recur as PHP changes and the other scripts that hosts use get updated. We are a small minority fighting for usable server space; I guess our voice is not big enough, hence the negativity from GoDaddy.
Nick. dw7240.com.
-
Hey Nick,
Scratching my head. You have the right code (http://dw7240.com/Base-Canada/get-metar-conditions-inc.php?sce=view)
curl_setopt($ch, CURLOPT_RESOLVE, array("tgftp.nws.noaa.gov:443:140.90.101.79"));
curl_setopt($ch, CURLOPT_TIMEOUT, $numberOfSeconds); // data timeout
$data = curl_exec($ch); // execute session
if (curl_error($ch) <> '') { // IF there is an error
$Debug.= "<!-- curl Error: " . curl_error($ch) . " -->\n"; // display error notice
}
However... (view-source:http://dw7240.com/Base-Canada/index.php?debug=y)
<!-- get-metar-conditions-inc.php - Version 1.17 - 30-Nov-2018 -->
<!-- mtr_conditions using METAR ICAO='CYOO' -->
<!-- curl fetching 'https://tgftp.nws.noaa.gov/data/observations/metar/stations/CYOO.TXT' -->
<!-- curl Error: Resolving timed out after 6000 milliseconds -->
<!-- HTTP stats: RC=0
Times: dns=0.000 conn=0.000 pxfer=0.000 get=6.001 total=6.001 secs -->
<!-- headers returned:
-->
I found an instance of someone getting the same error and fixing it by adding:
curl_setopt($ch, CURLOPT_DNS_USE_GLOBAL_CACHE, false);
(https://stackoverflow.com/questions/36434049/php-curl-curlopt-resolve-not-working)
So you could try that. Goes with the other options before the curl_exec() call.
-
1) Anyone gotten anywhere with GoDaddy?
2) Has everyone affected put in the temporary hack to get around the problem?
I have GoDaddy, and I ran testdns.php and got this:
Running on www.gosportwx.com (160.153.94.162)
dns_get_record("tgftp.nws.noaa.gov", DNS_A) returns:
bool(false)
gethostbyname("tgftp.nws.noaa.gov") returns: 140.90.101.79
I too have done the workaround and it appears to be working now, except for my CloudLevel. It seems like my website still loads slowly as well. :(
-
Hello all,
I too have been having this no-METAR-data issue since around September 9th and was not getting any METAR data at all on my homepage dashboard (near the top of the thermometer, where the text words of the current conditions go) and also on my nearby METAR reports page. I wasn't aware that this many other people were having the same issue till now. I have called GoDaddy about 4 times and they always tell me it is not their problem or their servers. I also know of 3 other sites not mentioned in this thread that are having the same issue, and my USA Extremes script is also not working, so I temporarily removed the code from my menubar.php file so it doesn't display at all for now.
These 2 issues were making my web pages load super slow and it is very annoying. I also disabled my METAR data from doing anything by going into my settings-weather.php file and commenting out lines like this with the two "//"s at the beginning of the line:
//$SITE['conditionsMETAR'] = 'KTAN'; // set to nearby METAR for current conditions icon/text
One thing I have noticed in the past 2 days (September 17-18th) is that it is now sometimes working again without any workarounds, so tonight I have re-enabled the METAR data fetch in my settings-weather.php file.
As of 8pm EST tonight (September 18th) I am getting METAR data. Hopefully it will keep working, and maybe GoDaddy fixed the DNS issues Ken mentioned.
Big thanks to Jasiu and Ken for their help and efforts, and to everyone else for their input. I have been with GoDaddy for my domain name since 2007, and when the E-rice webhost vanished I went to GoDaddy for my webhosting too, at least 5 years ago I believe, and hadn't had many issues with them till this chain of events started happening.
Special message to Jasiu: I just noticed you are also located in Taxachusetts like me, and I'm wondering when you first put your site online? I checked it out tonight; this is the first time I realized someone from that area was also involved in the weather website world. On my left sidebar I have external links to others in Mass if you ever want to check them out.
www.gateway2capecod.com (West Wareham, MA), about 10 miles from the Cape Cod Canal.
......Chris
-
Special message to Jasiu: I just noticed you are also located in Taxachusetts like me, and I'm wondering when you first put your site online? I checked it out tonight; this is the first time I realized someone from that area was also involved in the weather website world. On my left sidebar I have external links to others in Mass if you ever want to check them out.
www.gateway2capecod.com (West Wareham, MA), about 10 miles from the Cape Cod Canal.
......Chris
Original site with the Saratoga template went up late July 2015. I redid the UI into its present form (evolving slowly since then) in June of 2016. There is also another site in Weston that uses my code (http://himead.com) but I'll note that I'm NOT getting into the distribution and support business. :grin:
-
Weird, Chris... I'm seeing inconsistent results for your site. Sometimes the DNS resolution goes fine and everything loads. Other times I still see the "curl Error: Could not resolve host: tgftp.nws.noaa.gov" message.
edit: put a comma between "weird" and "Chris" so you didn't think I was calling you "Weird Chris". :grin:
-
Is there any way to fix cloudbaseCU.php loading slowly? I know the METAR is causing it; I just don't know how to fix it within the script.
-
I don't have / use that code. If you want to DM me the source (you'll need to rename it to a .txt file to do that), I can have a look.
-
Hello again,
Yes, Jasiu, sometimes it loads and sometimes not. It was working most of last night and this morning, then crapped out shortly after, and now at 8:30pm EST it's still not loading.
So I called the idiots at GoDaddy and told them to do this, as per Ken's instructions:
Thanks to Jasiu for the bypass fix shown above. The root cause of the issue for GoDaddy folks is a DNS lookup failure by the GoDaddy caching DNS servers for tgftp.nws.noaa.gov. You have to ask their tech support to
1) login to the Webserver using SSH
2) try using
Code: [Select]
nslookup tgftp.nws.noaa.gov
and see/fix the error messages shown.
They told me I would need the developer of the files (Ken) to login to my server using SSH and that they could not do it. They just maintain, or as she said, make sure my shared webhosting site is loading, which it is, but they can't fix specific things like one file not being able to go out and fetch data. They are obviously f**king morons and I'm seriously thinking of getting web hosting from some other place in the near future.
I told the woman on the phone that I could take all my files and put them on my buddy's server on a different host and the site loads fine, including the METAR data stuff, but as of September 9th something is not allowing the METAR data to work on their servers.
At the same time my METAR data stopped working, so did my USA Extremes script that was originally on my left sidebar; I have since removed it since it doesn't load the data.
So pissed at them @#$#@!!!
-
Hey Eric,
Yeah, same thing. Similar fix. go into usaextremes.php, find the function curl_get_contents(), and add this line just before the curl_exec() call:
curl_setopt($ch, CURLOPT_RESOLVE, array("www.cpc.ncep.noaa.gov:443:140.90.101.79"));
Note that this is the same IP address as the other case. Both resolve to the same server.
Thanks, Jasiu. Currently things seem to be working without the temporary fix, but I'll use this if the problem returns. I have noticed that the issue has been coming and going the past few days.
-
Hello....
Well, I am presently using the workaround that Jasiu created for my METAR data and it seems to be working and loading really well. Thanks for the workaround, Jasiu.
as for Godaddy.......UUUGGGHHHHHH!........Lol
Thanks....Chris
-
I had an interesting thing happen in reference to this issue. I received an email from GoDaddy asking me for feedback on my recent "experience" with their tech support. The ratings were from 1 to 10, with 10 being the best. I gave every single category a one, and fortunately there was also a "comment" section. I lit it up, letting them know how unprofessional, rude and useless their tech support is.
Now comes the interesting part. The following morning I received an unsolicited text message wanting to discuss my review. This is what it said: "Hello this is Ivette with GoDaddy. I wanted to let you know I'm reviewing your recent customer support survey feedback. My team is available to help if you have unresolved issues or would like to discuss further. If you aren't into texting, no hard feelings! Just reply stop." There are 30 messages in the conversation. I even referenced this thread during the conversation for them to read. I am waiting for Ken to return from his much-needed and well-deserved vacation and I am going to send him the entire conversation. I seem to have "escalated" this issue to a level where it can be discussed with folks outside the phone techs. I am glad it was done by SMS so I have a record of the conversation. I am very interested to hear Ken's thoughts.
Updated 4:53pm: btw, even after detailing Ken's response, word for word, they still refuse to acknowledge that they have a problem. They insist it is our "custom" scripts. I can promise you this: it is far from over. I'm 65 and have mellowed with age and don't let the "small" things bother me. Well, I am HOT!!
-
I just tried https://www.hsnpar.com/testdns.php and got a good result. Seems the issue is coming and going. Incoming requests on high-volume server farms are often load-balanced between multiple physical servers in order to improve response time (I worked on a product that did just that a couple of decades ago). Just a total guess, but maybe that is what is going on here and it depends on which actual piece of hardware processes the request.
-
Could that have anything to do with your fix that I applied?
Update: I ran it this morning, Sat. 9/21, and received:
bool(false)
Host address is tgftp.nws.noaa.gov
-
The fix/kludge that is in only affects the curl call in that file. Any other resolution requests still go through the normal process.
-
Thanks to Jasiu for the bypass fix shown above. The root cause of the issue for GoDaddy folks is a DNS lookup failure by the GoDaddy caching DNS servers for tgftp.nws.noaa.gov. You have to ask their tech support to
1) login to the Webserver using SSH
2) try using nslookup tgftp.nws.noaa.gov
and see/fix the error messages shown.
They can use the contents of /etc/resolv.conf to figure out which nameserver(s) are having the issues. It is THEIR problem to fix, and Jasiu’s bypass is a good temporary workaround.
I am currently on a cruise ship between Bergen, Norway and Torshavn, Faroe Islands on our way to Montreal, Canada on the 30th. So.. responses to issues are going to be quite limited by me as I’m away from my normal debugging tools (writing this on an iPad using ship’s internet).
BTW.. in Bergen, Norway we’ve experienced the remnants of Dorian causing heavy rain/wind and high seas. What goes up the east coast of the USA ends up in Europe, thanks to following the Gulf Stream.
I was able to login using SSH, this is what I saw.
[/]$ nslookup tgftp.nws.noaa.gov
;; Got SERVFAIL reply from 72.167.234.214, trying next server
;; connection timed out; trying next origin
;; Got SERVFAIL reply from 72.167.234.214, trying next server
;; connection timed out; no servers could be reached
[/]$ more /etc/resolv.conf
# Generated via Puppet
nameserver 72.167.234.213
nameserver 72.167.234.214
options rotate
As noted, the problem seems to be intermittent. I'll keep an eye out and see if I can run the commands again when DNS lookup is working. Then I'll contact GoDaddy with the results and see if I can get them to take a look.
-
I don't have / use that code. If you want to DM me the source (you'll need to rename it to a .txt file to do that), I can have a look.
Hey Jasiu, here is my cloudbaseCU.php as a txt. Sorry it took so long; I've been busy. Hope you or someone can help me with the METAR on this one.
Mike
-
Mike - unfortunately that code uses the simple PHP file() function rather than the curl system used by Ken's METAR code to fetch the METAR data from NOAA. I don't think you can override the local DNS decision with the file() call (at least I don't know how to do it).
In order to use the same workaround, someone would have to modify your code to use curl rather than file(). Or you'll just have to wait until the GoDaddy problem gets fixed.
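To illustrate what that conversion would involve, here is a hedged sketch (not Ken's code, and not the cloudbaseCU.php source): a file() fetch replaced by curl with the DNS bypass. The station file and the pinned IP 140.90.101.79 come from earlier posts in this thread and may change.

```php
<?php
// Sketch: replace file($url) with a curl fetch that pins the DNS answer
// for tgftp.nws.noaa.gov, sidestepping the broken caching resolver.
// The pinned IP (140.90.101.79) was reported in this thread and may change.
$url = 'https://tgftp.nws.noaa.gov/data/observations/metar/stations/KSFO.TXT';

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);   // return body as a string
curl_setopt($ch, CURLOPT_TIMEOUT, 10);            // data timeout (seconds)
if (defined('CURLOPT_RESOLVE')) {                 // PHP 5.5+ with libcurl 7.21.3+
    curl_setopt($ch, CURLOPT_RESOLVE,
        array('tgftp.nws.noaa.gov:443:140.90.101.79'));
}
$data = curl_exec($ch);
curl_close($ch);

if ($data === false) {
    echo "curl fetch failed\n";
} else {
    // file() returns an array of lines; mimic that for the existing caller
    $lines = preg_split('/\r?\n/', trim($data));
    print_r($lines);
}
```

The defined() guard matters on older PHP builds, where an unknown constant would otherwise be silently treated as a string.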
Camarillo - great getting someone to SSH into a GoDaddy server. Will be watching your results.
-
Mike - unfortunately that code uses the simple PHP file() function rather than the curl system used by Ken's METAR code to fetch the METAR data from NOAA. I don't think you can override the local DNS decision with the file() call (at least I don't know how to do it).
In order to use the same workaround, someone would have to modify your code to use curl rather than file(). Or you'll just have to wait until the GoDaddy problem gets fixed.
Camarillo - great getting someone to SSH into a GoDaddy server. Will be watching your results.
Well, that sucks. Wish GoDaddy wasn't stupid and could fix the issue without workarounds, lol. Thanks for looking at it.
-
Is there a way to make get-metar-conditions use Cumulus (Zambretti forecast) data ONLY? I set getmetar to false in my cloudbaseCU.php and it is now correctly showing mostly cloudy near my location instead of clear.
Edit: I just had to delete the METAR cache for the station near me and it now shows correctly. I just hope this doesn't have to happen every time to get it to work. I'll keep an eye on it.
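If the cache keeps going stale, deleting it by hand gets tedious. Here is a hypothetical helper (not part of Ken's scripts) that purges old METAR cache files so the next page load refetches; the cache directory and the `metar-*.txt` name pattern are assumptions, so adjust both to match your installation.

```php
<?php
// Hypothetical helper: remove METAR cache files older than $maxAge seconds
// so the next page load fetches fresh data. The directory layout and the
// "metar-*.txt" pattern are assumptions -- adjust to your install.
function purgeStaleMetarCache($cacheDir, $maxAge = 3600) {
    $removed = array();
    foreach (glob($cacheDir . '/metar-*.txt') as $file) {
        if (time() - filemtime($file) > $maxAge) {
            unlink($file);           // stale: delete so it gets refetched
            $removed[] = $file;
        }
    }
    return $removed;                 // list of files that were purged
}

// Example: purge anything older than an hour in ./cache
print_r(purgeStaleMetarCache('./cache', 3600));
```

You could call this from a small cron job rather than waiting for a page load to notice the staleness.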
-
Hey,
I don't want to speak too soon, but it looks like something has changed at GoDaddy: my site now loads as normal, with normal METAR data etc. I'm now wondering if I should go back to the scripts I used before this issue; I'll give it 24 hours, then switch back and see if things work. Interesting that my Sager forecast came back working at the same time.
Awesome... Huge thanks to everyone here who contributed information, and to those with the knowledge to get these scripts working in our favour.
Thank You !!!!!!!!
Nick. dw7240.com
-
interesting that my Sager forecast is back working at the same time.
BTW, the Sager forecaster uses a METAR look-up for cloud conditions.
-
I was able to login using SSH, this is what I saw.
[/]$ nslookup tgftp.nws.noaa.gov
;; Got SERVFAIL reply from 72.167.234.214, trying next server
;; connection timed out; trying next origin
;; Got SERVFAIL reply from 72.167.234.214, trying next server
;; connection timed out; no servers could be reached
[/]$ more /etc/resolv.conf
# Generated via Puppet
nameserver 72.167.234.213
nameserver 72.167.234.214
options rotate
As noted, the problem seems to be intermittent. I'll keep an eye out and see if I can run the commands again when DNS lookup is working. Then I'll contact GoDaddy with the results and see if I can get them to take a look.
As noted by Nick, things appear to be working again for now. I logged in again with SSH and reran the commands as suggested by Ken. This is what I saw:
[/]$ nslookup tgftp.nws.noaa.gov
Server: 72.167.234.213
Address: 72.167.234.213#53
Non-authoritative answer:
tgftp.nws.noaa.gov canonical name = tgftp.op.ncep.noaa.gov.
tgftp.op.ncep.noaa.gov canonical name = tgftp.cp.ncep.noaa.gov.
Name: tgftp.cp.ncep.noaa.gov
Address: 140.90.101.79
[/]$ more /etc/resolv.conf
# Generated via Puppet
nameserver 72.167.234.213
nameserver 72.167.234.214
options rotate
I am pushing the envelope here with my troubleshooting skills, so bear with me. It looks to me like when it wasn't working, it tried 72.167.234.214 and that failed; it then tried 72.167.234.213, which did not respond. When it was working, 72.167.234.213 was the server that responded. Anyone out there with more knowledge than me who can confirm that, or tell us what's really going on?
Eric
-
Note that you can specify the server after the host name in the nslookup command. E.g.,
nslookup tgftp.nws.noaa.gov 72.167.234.214
-
Note that you can specify the server after the host name in the nslookup command. E.g.,
nslookup tgftp.nws.noaa.gov 72.167.234.214
I did not know that, thanks. Here are the results:
[/]$ nslookup tgftp.nws.noaa.gov 72.167.234.213
Server: 72.167.234.213
Address: 72.167.234.213#53
Non-authoritative answer:
tgftp.nws.noaa.gov canonical name = tgftp.op.ncep.noaa.gov.
tgftp.op.ncep.noaa.gov canonical name = tgftp.cp.ncep.noaa.gov.
Name: tgftp.cp.ncep.noaa.gov
Address: 140.90.101.79
[/]$ nslookup tgftp.nws.noaa.gov 72.167.234.214
Server: 72.167.234.214
Address: 72.167.234.214#53
Non-authoritative answer:
tgftp.nws.noaa.gov canonical name = tgftp.op.ncep.noaa.gov.
tgftp.op.ncep.noaa.gov canonical name = tgftp.cp.ncep.noaa.gov.
Name: tgftp.cp.ncep.noaa.gov
Address: 140.90.101.79
Looks like both servers are working at the moment. I'll try it again if the problem returns.
Eric
-
It looks like the GoDaddy resolution problem is back. I've seen it on two sites now.
-
Folks:
WOW! :eek:
I didn't realize when I posted the question on Sept 13th about the METAR web page that there was such a huge underlying problem. I agree customer service at GoDaddy leaves something to be desired; I've had problems with their servers in the past, and getting them addressed is a pain, usually taking multiple phone calls.
I decided to simplify my web site by using the NWS RWR (Regional Weather Roundup) product. Please see the attached link.
It loads very fast and the coding is simpler. The downsides are that the new web page is all text, and not all NWS local offices produce the RWR product.
http://yourlocalweather-clay-ny.com/wxforecast-regional-weather-roundup.php (http://yourlocalweather-clay-ny.com/wxforecast-regional-weather-roundup.php)
Thanks to all for your time and efforts
-
Well, as feared, I spoke too soon. I came home from work and, yes, a slow site load again, and no METAR reports, just as before. This problem seems to be a tricky one to pin on a clear culprit; I'm beginning to think GoDaddy is not the issue, and that the main problem stems from certain NOAA servers.
If you're asking why I think this, it's because I also had an issue with the NOAA radio broadcasts. I retransmit the Toronto (XMJ225, 162.4 MHz) weather radio broadcast, and funnily enough that issue started around the same time: the broadcast would come across with no forecasts available from the list for the region, which was weird to listen to, and I never gave much thought to whether the issues could be linked. Not knowing how NOAA gathers its forecasts, I assume all the forecast information crosses over between each form of broadcast, whether it be METAR reports for the airports or the weather radio broadcast, and that the data comes from an end point, namely a server which feeds the outlets mentioned. I could be barking up the wrong tree, but after a couple of weeks the radio broadcasts seem to be back to normal.
Anybody with thoughts on this, I'd be happy to read them; I'm not a stupid person, but I would welcome the extra knowledge to help in the future.
Thanks.
Nick. dw7240.com
-
Here's why it can't be the actual NOAA servers...
Because no requests actually go to the NOAA servers. It's in the "how do I get to NOAA" part of the process that it breaks.
A loose analogy... DNS on the Internet is somewhat like the contacts we have on our phones. Most of us are old enough to remember when you had to memorize phone numbers or at least have the ones you used most written down somewhere because you had to manually enter the number. Now, if I need to call Jane Smith, I just tap on "Jane Smith" and magic happens. I don't need to actually interact with the number.
Similarly, DNS maps domain and sub-domain names to IP addresses, and it is a global system so that anyone can reach any valid domain. But to get to any particular domain, when you get to what's transmitted on the wire/fiber/radio, you need its IP address. Without it, it would be like your phone trying to call Jane Smith when the phone number is blank.
That is what is happening in this case. The METAR code is saying to the local server "I want to connect to tgftp.nws.noaa.gov" and the GoDaddy server says back, "Huh. Never heard of the guy. Sorry". So there is never a single data request to NOAA because there is no way to contact their server.
If this were a problem outside of GoDaddy, others would see the issue also. I've seen no reports with other providers.
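As a small illustration of how that failure looks from PHP (this is not from Ken's scripts): gethostbyname() does not return false on a failed lookup, it returns the input hostname unchanged, so comparing input and output is the standard failure check.

```php
<?php
// PHP's gethostbyname() does not return false on failure -- it returns the
// input hostname unchanged. Comparing input and output detects the failure.
function resolveOrFail($host) {
    $ip = gethostbyname($host);
    return ($ip === $host) ? false : $ip;   // false = DNS lookup failed
}

// The reserved .invalid TLD never resolves, demonstrating the failure path:
var_dump(resolveOrFail('no-such-host.invalid'));  // bool(false)
```

On a broken GoDaddy server, `resolveOrFail('tgftp.nws.noaa.gov')` would return false too, which is exactly the "never heard of the guy" situation described above.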
-
Hey,
Awesome response, thank you for the explanation; that gives me better insight into what is going on. It also backs up those who said they have no issues with other providers. But I've also heard that even some GoDaddy users are having no issues, so that tells me it's one particular set of servers, and if so, there shouldn't be an issue for GoDaddy techs to find them. Like I've said before, we weather enthusiasts are a small bunch compared to other users globally. A bigger voice normally prompts action, but nowadays, with everything coming down to money, I'm surprised GoDaddy aren't listening; there's a lot of competition out there from other hosting providers, some cheaper, maybe less reliable, but they exist for people like us. Sometimes money doesn't do the talking; it's about customer satisfaction. I guess that means nothing now, and look at the result!
Again thanks for the explanation, can only hope GoDaddy can do the right thing, but I think we're all in for a long ride.
Nick. dw7240.com.
-
Thanks Jasiu!
-
I just spent 2½ hours on chat with GoDaddy tech support and got absolutely nowhere. I am conceding defeat. Anyone have a recommendation for another hosting provider that isn't experiencing this issue?
-
And the temp fix has stopped working; seems odd. About to do that myself. Did you ask them to SSH into your server and ping the address?
The server I'm on doesn't seem to recognize CURLOPT_RESOLVE as valid; that would make it a very old version of PHP.
-
And the temp fix has stopped working; seems odd. About to do that myself. Did you ask them to SSH into your server and ping the address?
Odd that the fix stopped working for you, still looks good here. I asked two different techs to run nslookup and observe the failure. I sent them screen shots of me doing the same with the resulting failure. I ran nslookup from my PC and it worked. I ran nslookup from an online nslookup site and it worked. I explained the issue every which way I could. I got nowhere.
You can set your version of PHP in the control panel. I am at 7.0.33, any further than that and I start to get errors on some of the pages.
-
I get this with the "fix" now.
Warning: curl_setopt() expects parameter 2 to be long, string given in /home/content/80/7511980/html/crowderfarm/get-metar-conditions-inc.php on line 316
using
curl_setopt($ch, CURLOPT_RESOLVE, array("tgftp.nws.noaa.gov:443:140.90.101.79"));
curl_setopt($ch, CURLOPT_TIMEOUT, $numberOfSeconds); // data timeout
$data = curl_exec($ch); // execute session
if (curl_error($ch) <> '') { // IF there is an error
$Debug.= "<!-- curl Error: " . curl_error($ch) . " -->\n"; // display error notice
}
I've changed the update time to 84600 seconds until I can get this figured out. Probably need to call during normal business hours to get someone who knows anything.
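For anyone stuck on a PHP build where CURLOPT_RESOLVE is undefined, one less clean fallback (a sketch, not an official fix) is to request the IP directly and supply the Host header yourself. Note this requires relaxing certificate host checking, which weakens security, so treat it strictly as a temporary kludge; the IP is the one reported in this thread and may change.

```php
<?php
// Fallback sketch for PHP builds without CURLOPT_RESOLVE: fetch by IP and
// send the Host header manually. CURLOPT_SSL_VERIFYHOST must be relaxed
// because the certificate name won't match the bare IP -- a security
// trade-off, so use only until the DNS problem is fixed.
$ip      = '140.90.101.79';  // IP reported in this thread; may change
$station = 'KSFO';           // example station
$ch = curl_init("https://$ip/data/observations/metar/stations/$station.TXT");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 10);
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Host: tgftp.nws.noaa.gov'));
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 0);  // cert CN won't match the IP
$data = curl_exec($ch);
echo ($data === false) ? curl_error($ch) : $data;
curl_close($ch);
```

Both CURLOPT_HTTPHEADER and CURLOPT_SSL_VERIFYHOST exist back to very old PHP/libcurl versions, so this avoids the undefined-constant warning entirely.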
-
Yeah, highest I can go to is 5.6.
-
I get this with the "fix" now.
Warning: curl_setopt() expects parameter 2 to be long, string given in /home/content/80/7511980/html/crowderfarm/get-metar-conditions-inc.php on line 316
That says the version of PHP you are running doesn't recognize CURLOPT_RESOLVE (the constant on line 316). Unrecognized, PHP will interpret the bare constant name as a string rather than an integer/long, which is what curl_setopt() complains about.
How is it that you can't run a 7.x version? Version 5 isn't supported anymore, and any provider that forces you to use it should be left immediately (IMHO).
-
on the phone with them now.
You are right, but I then have to migrate my database and it's an old site.
-
See https://www.php.net/supported-versions.php (https://www.php.net/supported-versions.php).
-
You can set your version of PHP in the control panel. I am at 7.0.33, any further than that and I start to get errors on some of the pages.
I'm going to guess that what you are seeing are arrays initialized as blank strings. E.g., something like this:
$reallyAnArray = '';
.
.
.
.
$reallyAnArray[$x] = 1;
and the error comes on that last line.
If so, the way to fix those is to change the initialization to:
$reallyAnArray = array();
Ken fixed all of these in his scripts a while back. So if you are up-to-date, you should only see those in code from elsewhere. Jerry's (gwwilk) scripts are all clean too as far as I know.
-
You can set your version of PHP in the control panel. I am at 7.0.33, any further than that and I start to get errors on some of the pages.
I'm going to guess that what you are seeing are arrays initialized as blank strings. E.g., something like this:
$reallyAnArray = '';
.
.
.
.
$reallyAnArray[$x] = 1;
and the error comes on that last line.
If so, the way to fix those is to change the initialization to:
$reallyAnArray = array();
Ken fixed all of these in his scripts a while back. So if you are up-to-date, you should only see those in code from elsewhere. Jerry's (gwwilk) scripts are all clean too as far as I know.
At version 7.3, I get this error message at the top of the index page:
Warning: Use of undefined constant truee - assumed 'truee' (this will throw an Error in a future version of PHP) in /home/public_html/nws-alerts-config.php on line 58
I have not done any problem solving.
-
I have a guy from GoDaddy reading this thread.
-
They ran a curl test, and it didn't resolve. I think the guy is lost. They've read the thread, hopefully they can fix the issue.
-
Still on the phone.
-
They ran a curl test, and it didn't resolve. I think the guy is lost. They've read the thread, hopefully they can fix the issue.
Did they say they are going to keep working on it? When I was on they could see it wasn't resolving but did not think it was their issue.
-
It has been escalated to an "Engineer". Actually, I'm one of those, so we will see. BTW, their hold music bites.
-
I was told I had to move to a new server to get it to work, and that those of us on older packages were out of luck; they might migrate us for free, but they might charge for the migration. Next person want to try? The good news is they are aware of the problem now, but they don't seem to know why it's happening. (Translated to plain English.)
-
They ran a curl test, and it didn't resolve. I think the guy is lost. They've read the thread, hopefully they can fix the issue.
Did they say they are going to keep working on it? When I was on they could see it wasn't resolving but did not think it was their issue.
No, they tried to say it was someone else's problem, but then they pushed it onto "old servers". My problem is Exchange email. I either pay them to "migrate" my site and they will "prorate" my billing, or we are out of luck. They are end-of-lifing those servers, and obviously they don't know what "apt-get update" does... If you can't tell, I'm a little frustrated.
-
It's going to be a lot of work to migrate to another hosting/email provider. It looks like I'm going to have to.
-
I was told I had to move to a new server to get it to work, and that those of us on older packages were out of luck; they might migrate us for free, but they might charge for the migration. Next person want to try? The good news is they are aware of the problem now, but they don't seem to know why it's happening. (Translated to plain English.)
Moving my site (and my work one) to one of their "new" servers was pretty much painless. I've been with them since 2008. They moved everything over for me at no cost. I did lose my site for a day while the DNS propagated, and I had a couple of FTP issues figuring out their new setup on a more modern server,
but other than that, no problems at all.
PS: I don't use the local METAR reports either.
-
I’m using your fix and now it quit working... www.gosportwx.com
-
So now GoDaddy is giving this error with the fix/hack: unable to load KMQS data RC=403 Forbidden
-
At version 7.3, I get this error message at the top of the index page:
Warning: Use of undefined constant truee - assumed 'truee' (this will throw an Error in a future version of PHP) in /home/public_html/nws-alerts-config.php on line 58
I have not done any problem solving.
Sorry I didn't see this before. Change the "truee" on that line to "true". Somehow, someone's finger got stuck on the 'e' key.
## ALERT LOGGING
$logAlerts = truee; // true=log alerts false=don't log alerts
-
So now GoDaddy is giving this error with the fix/hack: unable to load KMQS data RC=403 Forbidden
Just looked at another site we were debugging on GD and see the same thing. If you try your METAR page in debug mode and view the source (http://www.gosportwx.com/wxmetar.php?debug=y), it's getting a 403 return for all of the METAR fetches. Fetching from elsewhere (other than GD) works just fine.
I would have hit my breaking point a while ago, but having moved sites between providers more than once, I probably have a quicker trigger than most.
-
Any hosting preferred over another besides GD? Thanks!
-
I use 1&1 (now 1&1 IONOS). I've been with them for well over a decade and kinda know the warts and how to work with them. For the most part, stuff just works and I don't need to bother them much; maybe once a year, if that.
At one point I also used Site5, just so that none of the sites for which I was doing volunteer work would crater if 1&1 did have a meltdown. They were great, but then they were acquired by Endurance International Group (EIG) and the customer service went right out the window. I then found out that this was par for the course for EIG, and I highly recommend avoiding ANY of their hosting companies.
https://www.reviewhell.com/blog/endurance-international-group-eig-hosting/ (https://www.reviewhell.com/blog/endurance-international-group-eig-hosting/)
I was looking for another backup after jumping ship from Site5, but 1&1 has been very stable, so I never actually followed through on the research I did (that was 3 years or so ago).
-
While I don't have this particular script, I have this one that pulls METAR data. It's working fine for me.
https://www.bismarckweather.net/wxmetar.php
I looked back at my old emails. I moved to GD's "new" servers back in 2017. I was told quite a while ago that the server I was on was going to be gone. Maybe not?
I also updated my site from PHP 7.2 to 7.3 this past weekend. None of my pages are having errors.
-
I’m using your fix and now it quit working... www.gosportwx.com
Looks like the NWS is blocking requests from your server:
<!-- get-metar-conditions-inc.php - Version 1.17 - 30-Nov-2018 -->
<!-- mtr_conditions using METAR ICAO='KHUF' -->
<!-- curl fetching 'https://tgftp.nws.noaa.gov/data/observations/metar/stations/KHUF.TXT' -->
<!-- HTTP stats: RC=403 dest=140.90.101.79 port=443 (from sce=72.167.190.6)
Times: dns=0.000 conn=0.000 pxfer=0.000 get=0.057 total=0.057 secs -->
<!-- headers returned:
HTTP/1.1 403 Forbidden
Date: Wed, 02 Oct 2019 19:54:06 GMT
Server: Apache
X-Frame-Options: SAMEORIGIN
X-Content-Type-Options: nosniff
X-XSS-Protection: 1; mode=block
Vary: Accept-Encoding
Content-Length: 243
Content-Type: text/html; charset=iso-8859-1
Strict-Transport-Security: max-age=31536000; includeSubdomains; preload
-->
<!-- mtr_conditions returns RC='403 Forbidden' for ICAO/METAR='KHUF' -->
<!-- mtr_conditions returns
condwords='unable to load KHUF data RC=403 Forbidden', iconnum=5, condicon='day_partly_cloudy.gif',
condicondesc='unable to load KHUF data RC=403 Forbidden'
metarUpdated = 1570046046 2019-10-02 07:54 GMT (Wed, 02-Oct-2019 3:54pm EDT)
-->
Maybe the NWS has just blacklisted 72.167.190.6, which is an egress server for GoDaddy. Your website is on 160.153.94.162.
Ed, your issue is a bit different. The lookup for tgftp.nws.noaa.gov is failing:
<!-- get-metar-conditions-inc.php - Version 1.17 - 30-Nov-2018 -->
<!-- mtr_conditions using METAR ICAO='KMQS' -->
<!-- curl fetching 'https://tgftp.nws.noaa.gov/data/observations/metar/stations/KMQS.TXT' -->
<!-- curl Error: Could not resolve host: tgftp.nws.noaa.gov -->
<!-- HTTP stats: RC=0 dest= port=0 (from sce=)
Times: dns=0.000 conn=0.000 pxfer=0.000 get=0.003 total=0.003 secs -->
<!-- headers returned:
-->
<!-- mtr_conditions returns RC='' for ICAO/METAR='KMQS' -->
You should install Jasiu's fix (above) to get-metar-conditions-inc.php and see what happens then (the fix is not currently installed on your website).
-
Any hosting preferred over another besides GD? Thanks!
This link might be helpful. Hosting providers (http://www.wxforum.net/index.php?topic=36893.msg379468#msg379468)
Doug
-
Ken - I was running his fix and got the Forbidden error today. The fix ran fine for a day or so. I removed it to see if it was the culprit; I've since put it back. The page still has an error, but the error loads faster (lol).
-
Ed,
You're getting the 403 error, but from a different GoDaddy egress server:
<!-- get-metar-conditions-inc.php - Version 1.17 - 30-Nov-2018 -->
<!-- mtr_conditions using METAR ICAO='KMQS' -->
<!-- curl fetching 'https://tgftp.nws.noaa.gov/data/observations/metar/stations/KMQS.TXT' -->
<!-- HTTP stats: RC=403 dest=140.90.101.79 port=443 (from sce=72.167.190.51)
Times: dns=0.000 conn=0.056 pxfer=0.198 get=0.060 total=0.257 secs -->
<!-- headers returned:
HTTP/1.1 403 Forbidden
Date: Wed, 02 Oct 2019 20:25:38 GMT
Server: Apache
X-Frame-Options: SAMEORIGIN
X-Content-Type-Options: nosniff
X-XSS-Protection: 1; mode=block
Vary: Accept-Encoding
Content-Length: 243
Content-Type: text/html; charset=iso-8859-1
Strict-Transport-Security: max-age=31536000; includeSubdomains; preload
-->
<!-- mtr_conditions returns RC='403 Forbidden' for ICAO/METAR='KMQS' -->
My guess is that the NWS webmaster for tgftp.nws.noaa.gov has put in a firewall block for queries from the 72.167.190.0/24 block (which I think is the set of egress servers, based on the whois registration of a /24 network).
Are you also on an "old" GoDaddy hosting plan (and not a newer cPanel hosting)?
-
My http://www.komokaweather.ca/wxmetar.php (http://www.komokaweather.ca/wxmetar.php) had worked after doing the mod a couple of weeks ago, but now gets the RC=403 Forbidden. I have not yet contacted GoDaddy. I upgraded to GoDaddy cPanel in 2018.
Enjoy,
Paul
-
This is nuts.. I refuse to switch hosts because it's a pain in the butt. There has to be some kinda fix for this!!
-
I have a cPanel host on GoDaddy. Just ran some requests via terminal:
swnmaint2@a2plcpnl0172 [~]$ hostname
a2plcpnl0172.prod.iad2.secureserver.net
swnmaint2@a2plcpnl0172 [~]$ hostname -i
198.71.224.69
swnmaint2@a2plcpnl0172 [~]$ nslookup tgftp.nws.noaa.gov
Server: 10.255.250.30
Address: 10.255.250.30#53
Non-authoritative answer:
tgftp.nws.noaa.gov canonical name = tgftp.op.ncep.noaa.gov.
tgftp.op.ncep.noaa.gov canonical name = tgftp.bldr.ncep.noaa.gov.
Name: tgftp.bldr.ncep.noaa.gov
Address: 140.172.138.79
swnmaint2@a2plcpnl0172 [~]$ cat /etc/resolv.conf
# Generated via Puppet
nameserver 10.255.250.30
nameserver 10.255.251.30
options rotate
swnmaint2@a2plcpnl0172 [~]$ curl -v https://tgftp.nws.noaa.gov/data/observations/metar/stations/KHUF.TXT
* About to connect() to tgftp.nws.noaa.gov port 443 (#0)
* Trying 140.172.138.79... connected
* Connected to tgftp.nws.noaa.gov (140.172.138.79) port 443 (#0)
* Initializing NSS with certpath: sql:/etc/pki/nssdb
* CAfile: /etc/pki/tls/certs/ca-bundle.crt
CApath: none
* SSL connection using TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA
* Server certificate:
* subject: CN=dev.ncep.noaa.gov,OU=Domain Control Validated
* start date: Feb 13 16:27:33 2019 GMT
* expire date: Jan 27 15:33:01 2020 GMT
* common name: dev.ncep.noaa.gov
* issuer: CN=Go Daddy Secure Certificate Authority - G2,OU=http://certs.godaddy.com/repository/,O="GoDaddy.com, Inc.",L=Scottsdale,ST=Arizona,C=US
> GET /data/observations/metar/stations/KHUF.TXT HTTP/1.1
> User-Agent: curl/7.19.7 (x86_64-redhat-linux-gnu) libcurl/7.19.7 NSS/3.27.1 zlib/1.2.3 libidn/1.18 libssh2/1.4.2
> Host: tgftp.nws.noaa.gov
> Accept: */*
>
< HTTP/1.1 200 OK
< Date: Wed, 02 Oct 2019 21:30:24 GMT
< Server: Apache
< X-Frame-Options: SAMEORIGIN
< X-Content-Type-Options: nosniff
< X-XSS-Protection: 1; mode=block
< Last-Modified: Wed, 02 Oct 2019 20:55:28 GMT
< ETag: "4f73a8c-5d-593f3b05286dd"
< Accept-Ranges: bytes
< Content-Length: 93
< Vary: Accept-Encoding
< Content-Type: text/plain; charset=UTF-8
< Strict-Transport-Security: max-age=31536000; includeSubdomains; preload
<
2019/10/02 20:53
KHUF 022053Z 24009KT 10SM FEW055 33/20 A2990 RMK AO2 SLP120 T03280200 56013
* Connection #0 to host tgftp.nws.noaa.gov left intact
* Closing connection #0
swnmaint2@a2plcpnl0172 [~]$
and it worked fine on the actual webserver. Note that the server has two private IP address DNS resolvers.
Paul, you appear to be using a different GoDaddy egress server:
<!-- get-metar-conditions-inc.php - Version 1.17 - 30-Nov-2018 -->
<!-- mtr_conditions using METAR ICAO='CYXU' -->
<!-- curl fetching 'https://tgftp.nws.noaa.gov/data/observations/metar/stations/CYXU.TXT' -->
<!-- HTTP stats: RC=403 dest=140.90.101.79 port=443 (from sce=50.62.176.93)
Times: dns=0.000 conn=0.060 pxfer=0.203 get=0.066 total=0.269 secs -->
<!-- headers returned:
HTTP/1.1 403 Forbidden
Date: Wed, 02 Oct 2019 21:35:37 GMT
Server: Apache
X-Frame-Options: SAMEORIGIN
X-Content-Type-Options: nosniff
X-XSS-Protection: 1; mode=block
Vary: Accept-Encoding
Content-Length: 243
Content-Type: text/html; charset=iso-8859-1
Strict-Transport-Security: max-age=31536000; includeSubdomains; preload
-->
<!-- mtr_conditions returns RC='403 Forbidden' for ICAO/METAR='CYXU' -->
<!-- mtr_conditions returns
condwords='unable to load CYXU data RC=403 Forbidden', iconnum=5, condicon='day_partly_cloudy.gif',
condicondesc='unable to load CYXU data RC=403 Forbidden'
metarUpdated = 1570052137 2019-10-02 09:35 GMT (02/10/2019 17:35)
-->
That IP is part of the 50.62.0.0/15 space assigned to GoDaddy, and 50.62.176.93 is not the IP of your site (23.229.161.0), so it's an egress server also.
-
This is nuts.. i refuse to switch hosts because its a pain in the butt. there has to be some kinda fix for this!!
There seem to be two issues:
1) The caching DNS servers for some GoDaddy websites cannot reliably do an IP lookup (gethostbyname()) for tgftp.nws.noaa.gov -- that would seem to be firmly in GoDaddy's court to solve. From the data y'all have gathered so far, this issue seems confined to older GoDaddy shared hosting.
Jasiu's bypass fix does address that problem.
2) A new problem: 403-Forbidden returns from tgftp.nws.noaa.gov, which appear to be caused by firewall rules likely implemented by the NWS on that server.
The NWS server is in the netblock:
NetRange: 140.172.0.0 - 140.172.255.255
CIDR: 140.172.0.0/16
NetName: NOAA-BOULDER
NetHandle: NET-140-172-0-0-1
Parent: NET140 (NET-140-0-0-0-0)
NetType: Direct Assignment
OriginAS:
Organization: NOAA-Boulder (NOAABO)
they list
OrgTechHandle: RN326-ARIN
OrgTechName: NOAA-Boulder NOC
OrgTechPhone: +1-812-856-7477
OrgTechEmail: nb-noc@noaa.gov
as the Tech/Abuse contacts for the IP space.
I'll send them a note asking for clarification about the 403-Forbidden responses for GoDaddy IP-space requests.
-
I've sent off the following note:
Gentlebeings,
I have several scripts that access data from https://tgftp.nws.noaa.gov/ for raw METAR data. Lately, several users of the script have begun encountering 403-Forbidden responses from the site for previously-working queries.
The common factor seems to be that requests are being issued from GoDaddy egress servers in the 72.167.190.0/24 or in 50.62.176.0/24 IP addresses. Have you blacklisted requests from those IP ranges? If so, would you please consider allowing access again?
My script uses a User-agent: Mozilla/5.0 (get-metar-conditions-inc.php - saratoga-weather.org)
The script can be viewed at https://saratoga-weather.org/get-metar-conditions-inc.php?sce=view
Thanks in advance for any assistance you can provide.
I'll let you know what they say.
-
I ran a Track DNS from the GoDaddy cPanel and got this: http://prntscr.com/pe2lgl (http://prntscr.com/pe2lgl)
-
Yep.. that's problem (1) above (failure to resolve tgftp.nws.noaa.gov).
If you have terminal access in your cPanel, do a cat /etc/resolv.conf
and it will identify the nameservers having the problem.
You can also do an nslookup -debug tgftp.nws.noaa.gov
and it will show the problem too.
Your hostname/IP can be found with hostname
hostname -i
All that info should be given to GoDaddy tech support to have them resolve the DNS lookup problem.
BTW.. no need to capture images; just copy/paste the responses as text into a [quote]...[/quote] entry in a post.
-
Hi all,
I did the workaround fix that Jasiu provided a few weeks ago, but today my site is once again not getting METAR data.
"unable to load KEWB data RC=403 Forbidden" is being displayed instead.
I have had GoDaddy hosting since E-rice went away as a host, so it has been quite a few years now.
I have tried calling them and explaining the problem a few times and got nowhere with their tech support.
...chris
-
Chris,
You're having problem (2) - the 403-Forbidden response
Your site is showing:
<!-- get-metar-conditions-inc.php - Version 1.17 - 30-Nov-2018 -->
<!-- mtr_conditions using METAR ICAO='KEWB' -->
<!-- curl fetching 'https://tgftp.nws.noaa.gov/data/observations/metar/stations/KEWB.TXT' -->
<!-- HTTP stats: RC=403 dest=140.90.101.79 port=443 (from sce=72.167.190.52)
Times: dns=0.000 conn=0.056 pxfer=0.196 get=0.061 total=0.257 secs -->
<!-- headers returned:
HTTP/1.1 403 Forbidden
Date: Wed, 02 Oct 2019 23:20:35 GMT
Server: Apache
X-Frame-Options: SAMEORIGIN
X-Content-Type-Options: nosniff
X-XSS-Protection: 1; mode=block
Vary: Accept-Encoding
Content-Length: 243
Content-Type: text/html; charset=iso-8859-1
Strict-Transport-Security: max-age=31536000; includeSubdomains; preload
-->
<!-- mtr_conditions returns RC='403 Forbidden' for ICAO/METAR='KEWB' -->
<!-- MB_setDateTimes date='2019-10-02' time=19:19:40' MDY=1 -->
<!-- MB_setDateTimes MBtime='2019 10 02 19 19 October Wednesday' values set -->
<!-- MB_setFeelslike input T,WC,HI,U='60.1,60.1,67.1,°F' cnvt T,WC,HI='15.6,15.6,19.5' feelslike=60 hcWord=Cool -->
and is using the 72.167.190.52 server for GoDaddy egress -- same problem IP range as previously cited.
I've not received a response from the NWS yet about my blacklisting query.
-
I've added a bit of extra diagnostics to Jasiu's testdns.php script.
Please try the following on your problem websites (and let me know the URL of the test script):
<?php
error_reporting(E_ALL);
echo "<p>\n";
echo "Running on " . $_SERVER["SERVER_NAME"] . " (" . $_SERVER["SERVER_ADDR"] . ")\n";
echo "</p>\n";
$NWS = "tgftp.nws.noaa.gov";
echo "<p>dns_get_record(\"$NWS\", DNS_A) returns:</p>\n";
$result = dns_get_record($NWS, DNS_A);
echo "<pre>\n";
print var_export($result,true);
echo "</pre>\n";
$hostaddr = gethostbyname($NWS);
echo "<p>gethostbyname(\"$NWS\") returns: '" . $hostaddr . "'</p>\n";
$resolver = file_get_contents('/etc/resolv.conf');
echo "<p>Contents of /etc/resolv.conf nameserver specification</p>\n";
echo "<pre>$resolver</pre>\n";
if(!empty($hostaddr)) {
  echo "<p>Output of 'curl -v https://$NWS/data/observations/metar/stations/KSJC.TXT 2>&1' command</p>\n";
  echo "<pre>\n";
  $curl = system('curl -v https://'.$NWS.'/data/observations/metar/stations/KSJC.TXT 2>&1');
  echo "\n";
  echo "</pre>\n";
} else {
  echo "<p>Curl test skipped as DNS lookup for $NWS failed.</p>\n";
}
?>
-
Ken - I loaded mine at www.atglenweather.net/testdns.php
-
Thanks Ed,
That shows the problem
Running on www.atglenweather.net (166.62.75.129)
dns_get_record("tgftp.nws.noaa.gov", DNS_A) returns:
false <== indicates failure of the query
gethostbyname("tgftp.nws.noaa.gov") returns: 'tgftp.nws.noaa.gov' <== should be an IP address (140.172.138.79)
Contents of /etc/resolv.conf nameserver specification
# Generated via Puppet
nameserver 72.167.234.213 <== these are the IP addresses of the GoDaddy caching DNS servers with the problem!
nameserver 72.167.234.214 <== and both these servers have the problem!!
options rotate
Oddly enough, the native curl command succeeded (but really slowly) where the PHP gethostbyname() lookup failed!
-
Here are my results from my GD server. I'm not having issues, for comparison.
https://www.bismarckweather.net/testdns.php
-
Maybe I've missed something, but I just uploaded http://www.komokaweather.ca/testdns.php
and it doesn't show much.
Enjoy, Paul
-
Paul,
I think that's Jasiu's original script, not the modified one (since it doesn't show the contents of /etc/resolv.conf).
Try uploading it again (the modified script above).
-
Here are my results from my GD server. I'm not having issues, for comparison.
https://www.bismarckweather.net/testdns.php
You're using GoDaddy DNS servers:
# Generated via Puppet
nameserver 10.255.250.30
nameserver 10.255.251.30
options rotate
and they're working. The problem DNS servers (so far) are
# Generated via Puppet
nameserver 72.167.234.213 <== these are the IP addresses of the GoDaddy caching DNS servers with the problem!
nameserver 72.167.234.214 <== and both these servers have the problem!!
options rotate
Thanks for sharing!
-
Here's some more interesting stuff about tgftp.nws.noaa.gov .. the actual DNS settings to get an A (address) record involve two CNAME hops:
dig @8.8.8.8 tgftp.nws.noaa.gov
; <<>> DiG 9.11.4-P2-RedHat-9.11.4-9.P2.el7 <<>> @8.8.8.8 tgftp.nws.noaa.gov
; (1 server found)
;; global options: +cmd
;; Got answer:
;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 19063
;; flags: qr rd ra ad; QUERY: 1, ANSWER: 3, AUTHORITY: 0, ADDITIONAL: 1
;; OPT PSEUDOSECTION:
; EDNS: version: 0, flags:; udp: 512
;; QUESTION SECTION:
;tgftp.nws.noaa.gov. IN A
;; ANSWER SECTION:
tgftp.nws.noaa.gov. 271 IN CNAME tgftp.op.ncep.noaa.gov.
tgftp.op.ncep.noaa.gov. 287 IN CNAME tgftp.bldr.ncep.noaa.gov.
tgftp.bldr.ncep.noaa.gov. 21587 IN A 140.172.138.79
;; Query time: 59 msec
;; SERVER: 8.8.8.8#53(8.8.8.8)
;; WHEN: Thu Oct 03 13:15:57 EDT 2019
;; MSG SIZE rcvd: 116
Maybe the failing GoDaddy DNS servers are not able to navigate the two CNAME hops to get the A record with the IP address. :roll: #-o
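As a pure illustration of what that navigation involves (this is not a real DNS client -- just the records from the dig output above treated as plain data), the chain-following a resolver has to do looks like:

```python
# Illustration only: follow a CNAME chain through a static record table until
# an A record (IP address) is reached, the way a caching resolver must.
# Records transcribed from the dig answer section above.
RECORDS = {
    "tgftp.nws.noaa.gov": ("CNAME", "tgftp.op.ncep.noaa.gov"),
    "tgftp.op.ncep.noaa.gov": ("CNAME", "tgftp.bldr.ncep.noaa.gov"),
    "tgftp.bldr.ncep.noaa.gov": ("A", "140.172.138.79"),
}

def resolve_a(name, records, max_hops=8):
    """Return the IP for name, following up to max_hops CNAME records."""
    for _ in range(max_hops + 1):
        rtype, value = records[name]
        if rtype == "A":
            return value
        name = value  # follow the CNAME to the next name
    raise RuntimeError("too many CNAME hops")
```

A resolver that gives up after a single hop would never reach the A record for tgftp.nws.noaa.gov, which matches the failure pattern we're seeing.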
As a test, for those having the GoDaddy DNS lookup issue, try changing the get-metar-conditions-inc.php script from
$host = 'tgftp.nws.noaa.gov';
to $host = 'tgftp.bldr.ncep.noaa.gov';
and see if that restores the functionality.
This fix won't work for the folks having the 403-Forbidden issue (and may just result in you seeing the 403-Forbidden issue too), but it's worth a try.
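The same try-the-direct-name-if-the-alias-fails idea can be sketched outside PHP. Here's a hedged Python illustration (function names are hypothetical, not from the template scripts; the resolver is injectable so the logic can be shown without a live lookup):

```python
import socket

def first_resolvable(hosts, resolve=socket.gethostbyname):
    """Return the first hostname in hosts that resolves, or None.

    `resolve` defaults to a real DNS lookup but can be swapped out,
    e.g. for testing or to use a different resolver library.
    """
    for host in hosts:
        try:
            resolve(host)
            return host
        except OSError:  # socket.gaierror subclasses OSError
            continue
    return None

# The manual workaround above amounts to preferring the direct A-record
# name when the CNAME alias fails to resolve:
CANDIDATES = ["tgftp.nws.noaa.gov", "tgftp.bldr.ncep.noaa.gov"]
```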
-
Ken,
The first pass at the testdns script had the ability to set a host (still works at e.g. https://www.hsnpar.com/testdns.php?host=radar3pub.ncep.noaa.gov (https://www.hsnpar.com/testdns.php?host=radar3pub.ncep.noaa.gov)) and pretty much any noaa.gov URL we tried failed in the same way, unfortunately.
-
Paul,
I think that's Jasiu's original script, not the modified one (since it doesn't show the contents of /etc/resolv.conf).
Try uploading it again (the modified script above).
Thanks Ken,
Trying to find the modified script but no success :oops:
Enjoy,
Paul
-
Paul,
I think that's Jasiu's original script, not the modified one (since it doesn't show the contents of /etc/resolv.conf).
Try uploading it again (the modified script above).
Thanks Ken,
Trying to find the modified script but no success :oops:
Enjoy,
Paul
The modified script is in this post (https://www.wxforum.net/index.php?topic=37823.msg390351#msg390351).
-
Ken - I tried the change ($host = 'tgftp.bldr.ncep.noaa.gov';) and things are working again! Thanks!
-
Ken - I tried the change ($host = 'tgftp.bldr.ncep.noaa.gov';) and things are working again! Thanks!
I see on your wxmetar.php page that loading is working there too:
<!-- get-metar-conditions-inc.php - Version 1.17 - 30-Nov-2018 -->
<!-- mtr_conditions using METAR ICAO='KLNS' -->
<!-- curl fetching 'https://tgftp.bldr.ncep.noaa.gov/data/observations/metar/stations/KLNS.TXT' -->
<!-- HTTP stats: RC=200 dest=140.172.138.79 port=443 (from sce=72.167.190.51)
Times: dns=0.004 conn=0.048 pxfer=0.162 get=0.047 total=0.209 secs -->
<!-- loaded from URL https://tgftp.bldr.ncep.noaa.gov/data/observations/metar/stations/KLNS.TXT -->
<!-- KLNS='2019/10/03 18:53 KLNS 031853Z 10007KT 7SM OVC009 18/16 A3004 RMK AO2 CIG 003V011 SLP169 T01780161' -->
<!-- age=1134 sec '2019-10-03 18:53:00 GMT' -->
and nicely fast also.
That kinda confirms my suspicion that the GoDaddy caching DNS servers weren't handling the two CNAME lookups correctly and were just timing out instead of returning the IP address in an A record. The fault is DEFINITELY with GoDaddy and their DNS server setup for your particular website hosting.
-
Thanks again Ken,
By running the updated http://www.komokaweather.ca/testdns.php (http://www.komokaweather.ca/testdns.php) I get the further details.
So I thought I would try it using our local METAR CYXU instead of KSJC in the script http://www.komokaweather.ca/testdns-1.php (http://www.komokaweather.ca/testdns-1.php) and that does not give a similar result.
Enjoy,
Paul
p.s. www.komokaweather.ca/wxmetar.php is still not working
-
Paul,
Your website is using the problematic GoDaddy DNS servers:
Contents of /etc/resolv.conf nameserver specification
# Generated via Puppet
nameserver 72.167.234.213
nameserver 72.167.234.214
options rotate
You should try changing the get-metar-conditions-inc.php script from
$host = 'tgftp.nws.noaa.gov';
to $host = 'tgftp.bldr.ncep.noaa.gov';
and see if that restores the functionality.
-
Hi Ken,
and yes that restored functionality http://www.komokaweather.ca/wxmetar.php
Great work =D> \:D/
Enjoy,
Paul
-
Hello Ken.....
I placed the testdns.php on my server here......
http://www.gateway2capecod.com/testdns.php
I pretty much got the same results as "PaulMy"
So I just did what you suggested......
You should try changing the get-metar-conditions-inc.php script from
Code: [Select]
$host = 'tgftp.nws.noaa.gov';
to
Code: [Select]
$host = 'tgftp.bldr.ncep.noaa.gov';
and see if that restores the functionality.
And now I am getting METAR data on the wxmetar.php page, and the current conditions section in my homepage dashboard is now correctly reading "overcast" instead of the dreaded "....unable to......"
Thanks for the Fix.
Wish there was some way that you could call GoDaddy yourself and tell whoever you get on the tech support line what needs to be fixed and see what they say. I tried explaining it to them when this happened a few weeks ago, but they deny there is any problem on their end that they can fix. They told me that my site is running so the server must be fine and the script must be wrong....lol....and because only one particular part of my site is not functioning correctly it's not their problem.
I think the tech people all of us here with GoDaddy have contacted don't have the knowledge or ability to fix what you say is wrong with their servers.
On another note.....are there any weather forum members who rent to others or have server space I can pay for, as a reliable web host at a fair price like E-rice "Alan" used to do for us? I haven't read any threads about anyone doing this, but maybe someone reading this has and can let me know. If not, is there any other web host for just one website domain that you could recommend?
It seems to me that GoDaddy is really not worth our time anymore for web hosting, although I will have to keep my domain name with them, as people over many years have my site bookmarked and changing it will just confuse/lose them.
Hope your Voyage was a Great one Ken and Welcome Back.
have a great Evening....Chris
-
Any other success reports? This provides all the info that would be needed to give to GoDaddy. Not that they'll do anything with it....
-
Any other success reports? This provides all the info that would be needed to give to GoDaddy. Not that they'll do anything with it....
Hello Jasiu.........as you stated...I doubt they will do anything about it. They keep telling me it's an issue with my script and to contact my web developer.....lol...and that my website itself is up and running so everything on their servers is all set.
What we should do is just say bye-bye to GoDaddy and move to a better web host with fewer headaches. I have not had problems like this with them in the past, and they are a very well-known web host, but it is ridiculous that they can't do something about this, which is affecting quite a few of us in here.
For me, I just need a fairly cheap web host for 1 domain, as all I have is my Saratoga template site up and no other subdomains.
....chris
-
Hi Chris,
Glad the bypass worked :)
Yes, the voyage on the Viking Sun from Bergen, Norway to Montreal, Canada was great .. we only missed one port (Shetland Islands) due to high winds/high seas so we couldn't visit with tender boats. Food was outstanding, and for a smaller ship (960 passengers) it was surprisingly stable in 3-4 meter seas we'd encountered due to remnants of Dorian flowing eastwards.
Jasiu,
I'll try to formulate a query to their tech support to document our findings .. still don't have a reply from the NWS about the 403-Forbidden issue, but as for the DNS lookup issue for websites using the GoDaddy resolvers 72.167.234.213 and 72.167.234.214, it seems their DNS daemons are unable to resolve more than one CNAME hop to return the A record. Also, it's really bad form to have the two DNS servers in the same subnet..good practice would be to have them on separate subnets in separate routing infrastructure (diversity is good).
-
Ok, I've sent the following to noc@godaddy.com
Gentlefolks,
I have been contacted by multiple people using one of my scripts (get-metar-conditions.php) which obtains
data from the NWS website 'tgftp.nws.noaa.gov'. The common issue was a failure of the gethostbyname() PHP call (and nslookup/dig calls on the webservers) to resolve 'tgftp.nws.noaa.gov'. The common /etc/resolv.conf configuration for the websites hosted on GoDaddy was:
# Generated via Puppet
nameserver 72.167.234.213
nameserver 72.167.234.214
options rotate
and queries via those two servers would result in a timeout and gethostbyname('tgftp.nws.noaa.gov') would simply return 'tgftp.nws.noaa.gov' instead of the expected IP address of 140.172.138.79
A dig command on my server (at 1and1) shows:
dig @8.8.8.8 tgftp.nws.noaa.gov
; <<>> DiG 9.11.4-P2-RedHat-9.11.4-9.P2.el7 <<>> @8.8.8.8 tgftp.nws.noaa.gov
; (1 server found)
;; global options: +cmd
;; Got answer:
;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 19063
;; flags: qr rd ra ad; QUERY: 1, ANSWER: 3, AUTHORITY: 0, ADDITIONAL: 1
;; OPT PSEUDOSECTION:
; EDNS: version: 0, flags:; udp: 512
;; QUESTION SECTION:
;tgftp.nws.noaa.gov. IN A
;; ANSWER SECTION:
tgftp.nws.noaa.gov. 271 IN CNAME tgftp.op.ncep.noaa.gov.
tgftp.op.ncep.noaa.gov. 287 IN CNAME tgftp.bldr.ncep.noaa.gov.
tgftp.bldr.ncep.noaa.gov. 21587 IN A 140.172.138.79
;; Query time: 59 msec
;; SERVER: 8.8.8.8#53(8.8.8.8)
;; WHEN: Thu Oct 03 13:15:57 EDT 2019
;; MSG SIZE rcvd: 116
I suspect that the two GoDaddy DNS servers at 72.167.234.213 and 72.167.234.214 are failing to follow the two CNAME hops to get the A record (listed for tgftp.bldr.ncep.noaa.gov).
Would you please ask the DNS technical support for assistance to get those servers to respond? Various people have called GoDaddy first-level support to no avail -- the root cause of the problem is NOT in the scripts, it's a DNS server issue, as the script works on hundreds of other sites (including some working GoDaddy sites), just not the ones that use 72.167.234.213 and 72.167.234.214 as their /etc/resolv.conf nameservers.
Thanks in advance for your kind assistance.
I can be reached at this email or by cell phone 408-xxx-xxxx if additional information is needed.
--
Best regards,
Ken True
-
Hi,
Just for your information, my site seems to load as it did before this issue arose: everything normal, animated icons all working, cache files working, METAR fully working. Happy weather user again....
Nick. dw7240.com
-
Ken, your fix "$host = 'tgftp.bldr.ncep.noaa.gov';" worked for me. I'm on GoDaddy cPanel.
-
Ken - I wanted to give you a huge THANK YOU....
-
Wow.....
Guess I'll keep my motor mouth shut in future, LOL. Here we go again: it seemed fine for about a day or so, now it's back to very slow load times and no METAR.
COME ON GODADDY..........WHAT THE HECK..........AREN'T OUR SUBSCRIPTIONS ENOUGH TO PAY YOUR ENGINEERS!!!!
Nick. dw7240.com
-
getting another unable to load KBMG data RC= :/
-
Same here, no METAR... It's like they are blocking us on purpose!
-
Now your sites are getting
<!-- curl fetching 'https://tgftp.bldr.ncep.noaa.gov/data/observations/metar/stations/KBMG.TXT' -->
<!-- curl Error: Could not resolve host: tgftp.bldr.ncep.noaa.gov -->
<!-- curl Error: Could not resolve host: tgftp.bldr.ncep.noaa.gov -->
<!-- HTTP stats: RC=0 dest= port=0 (from sce=)
Times: dns=0.000 conn=0.000 pxfer=0.000 get=0.003 total=0.003 secs -->
responses. Their DNS servers are really being problematic.
Using my 1and1 server I still see the same DNS entry for tgftp.nws.noaa.gov:
# dig @8.8.8.8 tgftp.nws.noaa.gov
; <<>> DiG 9.11.4-P2-RedHat-9.11.4-9.P2.el7 <<>> @8.8.8.8 tgftp.nws.noaa.gov
; (1 server found)
;; global options: +cmd
;; Got answer:
;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 5203
;; flags: qr rd ra ad; QUERY: 1, ANSWER: 3, AUTHORITY: 0, ADDITIONAL: 1
;; OPT PSEUDOSECTION:
; EDNS: version: 0, flags:; udp: 512
;; QUESTION SECTION:
;tgftp.nws.noaa.gov. IN A
;; ANSWER SECTION:
tgftp.nws.noaa.gov. 141 IN CNAME tgftp.op.ncep.noaa.gov.
tgftp.op.ncep.noaa.gov. 141 IN CNAME tgftp.bldr.ncep.noaa.gov.
tgftp.bldr.ncep.noaa.gov. 21441 IN A 140.172.138.79
;; Query time: 55 msec
;; SERVER: 8.8.8.8#53(8.8.8.8)
;; WHEN: Fri Oct 04 19:30:40 EDT 2019
;; MSG SIZE rcvd: 116
I've not heard from noc@godaddy.com yet about my message to them (posted above). Sigh...
BTW, the DNS for tgftp.bldr.ncep.noaa.gov is only the A record, so they should resolve it:
dig @8.8.8.8 tgftp.bldr.ncep.noaa.gov
; <<>> DiG 9.11.4-P2-RedHat-9.11.4-9.P2.el7 <<>> @8.8.8.8 tgftp.bldr.ncep.noaa.gov
; (1 server found)
;; global options: +cmd
;; Got answer:
;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 58733
;; flags: qr rd ra ad; QUERY: 1, ANSWER: 1, AUTHORITY: 0, ADDITIONAL: 1
;; OPT PSEUDOSECTION:
; EDNS: version: 0, flags:; udp: 512
;; QUESTION SECTION:
;tgftp.bldr.ncep.noaa.gov. IN A
;; ANSWER SECTION:
tgftp.bldr.ncep.noaa.gov. 18231 IN A 140.172.138.79
;; Query time: 51 msec
;; SERVER: 8.8.8.8#53(8.8.8.8)
;; WHEN: Fri Oct 04 19:36:01 EDT 2019
;; MSG SIZE rcvd: 69
-
Same here, no METAR... It's like they are blocking us on purpose!
I know, right? This is ridiculous. I'm about to drop them like a bad habit!!!!
-
I've done some minor updates to the testdns.php script -- please update your copies.
<?php
error_reporting(E_ALL);
echo "<p>\n";
echo "Running on " . $_SERVER["SERVER_NAME"] . " (" . $_SERVER["SERVER_ADDR"] . ")\n";
echo "<br>PHP Version: ".phpversion();
echo "</p>\n";
$resolver = file_get_contents('/etc/resolv.conf');
echo "<p>Contents of /etc/resolv.conf nameserver specification</p>\n";
echo "<pre>$resolver</pre>\n";
$NWS = "tgftp.nws.noaa.gov";
echo "<p>dns_get_record(\"$NWS\", DNS_A) returns:</p>\n";
$result = dns_get_record($NWS, DNS_A);
echo "<pre>\n";
print var_export($result,true);
echo "</pre>\n";
$hostaddr = gethostbyname($NWS);
echo "<p>gethostbyname(\"$NWS\") returns: '" . $hostaddr . "'</p>\n";
if(preg_match('!^\d+\.\d+!',$hostaddr)) {
echo "<p>Output of 'curl -v https://$NWS/data/observations/metar/stations/KSJC.TXT 2>&1' command</p>\n";
echo "<pre>\n";
$curl = system('curl -v https://'.$NWS.'/data/observations/metar/stations/KSJC.TXT 2>&1');
echo "\n";
echo "</pre>\n";
} else {
echo "<p>Curl test skipped as DNS lookup for $NWS failed.</p>\n";
}
?>
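One key change above is the success test: PHP's gethostbyname() signals failure by returning its argument unchanged, so the script now checks whether the result looks like an IP address (the preg_match('!^\d+\.\d+!', ...) test) instead of merely non-empty. For anyone adapting the test elsewhere, a slightly stricter equivalent check in Python might be:

```python
import re

# gethostbyname()-style lookups can return the input name unchanged on
# failure, so "does the result look like an IPv4 address?" is the reliable
# success test. This is a stricter version of the PHP check, which only
# looks at the first two octets.
def looks_like_ipv4(result):
    return re.fullmatch(r"\d{1,3}(\.\d{1,3}){3}", result) is not None
```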
-
Running http://www.gosportwx.com/testdns.php, here's what I got. idk if I did it right so let me know lol..
Running on www.gosportwx.com (160.153.94.162)
PHP Version: 7.2.20
Contents of /etc/resolv.conf nameserver specification
# Generated via Puppet
nameserver 72.167.234.213
nameserver 72.167.234.214
options rotate
dns_get_record("tgftp.nws.noaa.gov", DNS_A) returns:
array (
0 =>
array (
'host' => 'tgftp.bldr.ncep.noaa.gov',
'class' => 'IN',
'ttl' => 2082,
'type' => 'A',
'ip' => '140.172.138.79',
),
)
gethostbyname("tgftp.nws.noaa.gov") returns: '140.172.138.79'
Output of 'curl -v https://tgftp.nws.noaa.gov/data/observations/metar/stations/KSJC.TXT 2>&1' command
* About to connect() to tgftp.nws.noaa.gov port 443 (#0)
* Trying 140.172.138.79... connected
* Connected to tgftp.nws.noaa.gov (140.172.138.79) port 443 (#0)
* Initializing NSS with certpath: sql:/etc/pki/nssdb
* CAfile: /etc/pki/tls/certs/ca-bundle.crt
CApath: none
* SSL connection using TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA
* Server certificate:
* subject: CN=dev.ncep.noaa.gov,OU=Domain Control Validated
* start date: Feb 13 16:27:33 2019 GMT
* expire date: Jan 27 15:33:01 2020 GMT
* common name: dev.ncep.noaa.gov
* issuer: CN=Go Daddy Secure Certificate Authority - G2,OU=http://certs.godaddy.com/repository/,O="GoDaddy.com, Inc.",L=Scottsdale,ST=Arizona,C=US
> GET /data/observations/metar/stations/KSJC.TXT HTTP/1.1
> User-Agent: curl/7.19.7 (x86_64-redhat-linux-gnu) libcurl/7.19.7 NSS/3.27.1 zlib/1.2.3 libidn/1.18 libssh2/1.4.2
> Host: tgftp.nws.noaa.gov
> Accept: */*
>
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- 0:00:10 --:--:-- 0< HTTP/1.1 200 OK
< Date: Fri, 04 Oct 2019 23:56:26 GMT
< Server: Apache
< X-Frame-Options: SAMEORIGIN
< X-Content-Type-Options: nosniff
< X-XSS-Protection: 1; mode=block
< Last-Modified: Fri, 04 Oct 2019 23:55:53 GMT
< ETag: "4f0790a-70-5941e713a415f"
< Accept-Ranges: bytes
< Content-Length: 112
< Vary: Accept-Encoding
< Content-Type: text/plain; charset=UTF-8
< Strict-Transport-Security: max-age=31536000; includeSubdomains; preload
<
{ [data not shown]
112 112 112 112 0 0 10 0 0:00:11 0:00:10 0:00:01 2488* Connection #0 to host tgftp.nws.noaa.gov left intact
* Closing connection #0
2019/10/04 23:53
KSJC 042353Z 33012KT 10SM FEW080 FEW100 23/06 A3008 RMK AO2 SLP186 T02280061 10239 20172 56007
-
My wxmetar.php page seems to be working fine now.. earlier it was giving me a 503 along with that error.. let's see how long this lasts lol
-
mkutche: That testdns.php result shows it's working (even using the problematic DNS resolvers on GoDaddy).
-
Ugh...Looks as if it's down again :-(
-
Ugh...Looks as if it's down again :-(
Yup. I ran Mike's (http://www.gosportwx.com/testdns.php (http://www.gosportwx.com/testdns.php)) and got:
Running on www.gosportwx.com (160.153.94.162)
PHP Version: 7.2.20
Contents of /etc/resolv.conf nameserver specification
# Generated via Puppet
nameserver 72.167.234.213
nameserver 72.167.234.214
options rotate
dns_get_record("tgftp.nws.noaa.gov", DNS_A) returns:
false
gethostbyname("tgftp.nws.noaa.gov") returns: 'tgftp.nws.noaa.gov'
Curl test skipped as DNS lookup for tgftp.nws.noaa.gov failed.
-
I've updated my testdns if that helps any. I'm on GD "new".
https://www.bismarckweather.net/testdns.php
-
I've updated my testdns if that helps any. I'm on GD "new".
https://www.bismarckweather.net/testdns.php
You're lucky.. not using the problematic DNS resolvers. You're using an internal network set:
# Generated via Puppet
nameserver 10.255.250.30
nameserver 10.255.251.30
options rotate
-
Running on crowderfarm.com (184.168.229.195)
PHP Version: 5.6.27
Contents of /etc/resolv.conf nameserver specification
nameserver 72.167.234.213
nameserver 72.167.234.214
options rotate
dns_get_record("tgftp.nws.noaa.gov", DNS_A) returns:
Warning: dns_get_record(): A temporary server error occurred. in /home/content/80/7511980/html/crowderfarm/TestDNS.php on line 16
false
gethostbyname("tgftp.nws.noaa.gov") returns: 'tgftp.nws.noaa.gov'
Curl test skipped as DNS lookup for tgftp.nws.noaa.gov failed.
-
Hi,
Here is my output from the testdns.php file...………….
Running on dw7240.com (132.148.62.129)
PHP Version: 5.3.24
Contents of /etc/resolv.conf nameserver specification
nameserver 72.167.234.213
nameserver 72.167.234.214
options rotate
dns_get_record("tgftp.nws.noaa.gov", DNS_A) returns:
array (
)
gethostbyname("tgftp.nws.noaa.gov") returns: 'tgftp.nws.noaa.gov'
Curl test skipped as DNS lookup for tgftp.nws.noaa.gov failed.
Thanks,
Nick. dw7240.com
-
I've updated my testdns if that helps any. I'm on GD "new".
https://www.bismarckweather.net/testdns.php
You're lucky.. not using the problematic DNS resolvers. You're using an internal network set:
# Generated via Puppet
nameserver 10.255.250.30
nameserver 10.255.251.30
options rotate
Wonder if I call GoDaddy and tell them to put me on the 10.255.250.30 servers like Bismarck is on...whether they will switch me to them instead of the 72.xxx.xxx.xxx ones.
Someone a few posts back mentioned they'd had enough already and switched to 1and1 for their host....Does anyone know the cheapest package 1and1 has that would be good for just the 1 website/domain that I have... and roughly how much that would cost yearly?
I updated my testdns.php file and that can be viewed here if you need it Ken..... http://www.gateway2capecod.com/testdns.php
As of today I am getting the "unable to load" message for my METAR, which started again a few days ago even after applying the Jasiu fix/workaround a few weeks ago.
-
It's not Ken's issue... It's GoDaddy's... I actually tried to get the guy to come on here and admit it, but obviously he can't. Even if you have cPanel, if you are on an old cluster it won't work. They would like to move you to a new cluster, but if they can they would like to charge you to do it. If you wait they'll just do it.
-
It's not Ken's issue... It's GoDaddy's...
Yes I know.
So I just called GoDaddy and asked if they could put me on the 10.255.xxx.xxx name servers that Bismarck is on (his METAR is working), and they said only if I switch from a shared hosting plan to a dedicated one.
My shared plan with no HTTPS is running me about $30.00 a year...The dedicated plan would run me about $240.00 a year...f**k That !!
Just went to 1and1, and from what I could see it looks like they would charge about $4.00 a month, cancel at any time, which would be $48.00 a year. Most likely a shared host without HTTPS, but most likely the METAR data will load correctly, unlike GoDaddy hosting.
Actually I think I might just bite the bullet tonight or Sunday and sign up for the $4.00 a month plan with 1and1, put copies of all my files on my server/hosting with them, and see how it goes. Maybe even get a new domain with them as well for testing purposes, and if all goes well just let my website visitors know that instead of my web address being www.gateway2capecod.com it will be www.gateway2capecod.net or something of that nature. Then I can say goodbye to GoDaddy for web hosting and also the domain name, since they don't want to help all of us.
....chris
-
I think I spend around $1,200.00 a year with GoDaddy: Office365 accounts and web hosting. Most of it is a hobby. I'm considering re-evaluating that after the last few weeks of conversations with them. I don't have time for that, especially when it's their fault.
Let me reiterate this: this has nothing to do with Ken's code or that of those who collaborated. This is solely on the shoulders of GoDaddy.
I'm staying with them for now because for years, "It just worked"...
Thank you Ken and all that contribute to these templates. I edit them to fit my phone but without them I would be lost.
-
Just went to 1and1, and from what I could see it looks like they would charge about $4.00 a month, cancel at any time, which would be $48.00 a year. Most likely a shared host without HTTPS, but most likely the METAR data will load correctly, unlike GoDaddy hosting.
You do get one SSL certificate with that if I'm reading it correctly.
Actually I think I might just bite the bullet tonight or Sunday and sign up for the $4.00 a month plan with 1and1, put copies of all my files on my server/hosting with them, and see how it goes. Maybe even get a new domain with them as well for testing purposes, and if all goes well just let my website visitors know that instead of my web address being www.gateway2capecod.com it will be www.gateway2capecod.net or something of that nature. Then I can say goodbye to GoDaddy for web hosting and also the domain name, since they don't want to help all of us.
When all is ready to go you can always just point your old domain name to the 1&1 servers.
-
www.new.crowderfarm.com loads instantly again. Test site on GoDaddy.
Run http://www.new.crowderfarm.com/TestDNS.php
Have to fix the rest of the site, but it seems to work.
-
@wcrowder
They've changed your name servers to the 'good' ones ...
# Generated via Puppet
nameserver 10.255.250.30
nameserver 10.255.251.30
options rotate
Let's see if they do it for everybody, or only the ones who threaten to walk!
-
www.new.crowderfarm.com loads instantly again. Test site on GoDaddy.
Run http://www.new.crowderfarm.com/TestDNS.php
Have to fix the rest of the site, but it seems to work.
How did you get them to switch servers??
-
I was on a really old Linux hosting package, they moved me to C Panel on another server. They wanted to charge me $99.00 to do that, I talked them out of it.
-
Hello......
Well, I bit the bullet and am now on 1and1 hosting, and also got a new domain with them, free for the first year. The web hosting is $4.00 a month for one website and came with a free SSL certificate, so my site will now be secure/HTTPS.
The new Domain is:
https://www.gateway2capecod.net
(The old Domain on Godaddy is http://www.gateway2capecod.com)
Basically it just changed from http to https, and at the end from .com to .net.
I have loaded all the files that make up the website, used the unaltered get-metar-conditions-inc.php file, and now the METAR data is working as it should.
I have updated my profile in this forum to reflect the new site if someone clicks on the link next to my user name. Will need to do this in a few other places as well.
I also plan on leaving the old GoDaddy site partially up, but not updating live data, so that I can inform my visitors of the domain name change and they can update their bookmarks.
Would have been nice to keep the old domain with the new web host, but this way I will no longer have any dealings with GoDaddy once the old domain and hosting expire.
One really nice benefit of being on 1and1 is that Ken from Saratoga, who writes all these amazing scripts and maintains them for all of us, is also on 1and1.
If anyone knows a simple way for me to point visitors from my old domain to the new domain, besides just adding a special note at the top of my old domain's homepage, please let me know.
Thanks......Chris https://www.gateway2capecod.net
-
Hi Chris,
In your old site, you can add to .htaccess
Redirect 301 / https://www.gateway2capecod.net/
and all visitors will automatically go to your new site.
BTW.. I've changed your entry in the NorthEastern network to use your new site :)
It will take up to an hour to propagate to the global network, but the NEWN now shows your conditions.
-
Hi Chris,
In your old site, you can add to .htaccess
Redirect 301 / https://www.gateway2capecod.net/
and all visitors will automatically go to your new site.
BTW.. I've changed your entry in the NorthEastern network to use your new site :)
It will take up to an hour to propagate to the global network, but the NEWN now shows your conditions.
Hello Ken......Thank you
-
Hello......
Well, I bit the bullet and am now on 1and1 hosting, and also got a new domain with them, free for the first year. The web hosting is $4.00 a month for one website and came with a free SSL certificate, so my site will now be secure/HTTPS.
The new Domain is:
https://www.gateway2capecod.net
Wow, mine's $295.75 per year, including SSL and dedicated IP. I also have 8 Office365 accounts with them, which makes it hard to leave, or I would. I'm back up mostly; I'll have to redo my front page. It did take me a while to fix all the errors that were blocking the padlock symbol, but it's mostly working now.
-
It's not Ken's issue... It's GoDaddy's...
Yes I know.
So I just called GoDaddy and asked if they could put me on the 10.255.xxx.xxx name servers that Bismarck is on (his METAR is working), and they said only if I switch from a shared hosting plan to a dedicated one.
....chris
I'm still on shared hosting, with dedicated IP and SSL add-ons. They are con artists.
-
Chris,
To add to what Ken said, the following in .htaccess on your NEW (1&1) server will redirect http accesses to https:
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
-
Chris,
To add to what Ken said, the following in .htaccess on your NEW (1&1) server will redirect http accesses to https:
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
Hello Jasiu.....
Thanks also for the tip. A few questions....if I were to give out the new URL without the https, is it the same thing, and will both https and http work for eternity?
I also noticed at the top part of my browser (Pale Moon and/or Microsoft) that when I do enter the new domain with the https....it still says not secure, even though they included a free SSL certificate with my purchase...any idea why this would be happening and how to fix?
I think the error about non-secure is referring to my site domain being private, so maybe this is the issue???
Also wanted to note to both you and Ken..I have added a special message to the top of my old site informing regular visitors to update their bookmarks...whereas if I just do the .htaccess redirect code, it will only work until my old domain gets shut off completely..at least this way visitors will have a chance to update their saved bookmark for my site, actually see the new domain, and know what has happened.
Thank you...Chris
see attached image....
-
Thanks also for the tip. A few questions....if I were to give out the new URL without the https, is it the same thing, and will both https and http work for eternity?
I'm not 100% sure what you are asking. If you add the bit I posted to your .htaccess, you can give people the URL with or without the "https://" and they will get secure access. That's what happens on my site (enter lexmaweather.info in your browser and you'll still get an SSL connection).
I also noticed at the top part of my browser (Pale Moon and/or Microsoft) that when I do enter the new domain with the https....it still says not secure, even though they included a free SSL certificate with my purchase...any idea why this would be happening and how to fix?
I think the error about non-secure is referring to my site domain being private, so maybe this is the issue???
Scroll a little further down in that image and you get your answer. Some content on your home page (most probably images) is fetched with "http://" URLs. The warning is that not ALL content on the page was fetched with SSL.
Also wanted to note to both you and Ken..I have added a special message to the top of my old site informing regular visitors to update their bookmarks...whereas if I just do the .htaccess redirect code, it will only work until my old domain gets shut off completely..at least this way visitors will have a chance to update their saved bookmark for my site, actually see the new domain, and know what has happened.
I assume you are going to end your GoDaddy contract. But that doesn't mean you can't keep the domain name. GD should have instructions on how to transfer the domain (to 1&1). Then, if you want, you can have both names pointing to the same server and no one has to update their bookmarks, etc. If you don't want to have to pay to maintain both URLs, though, you'd have to go with the new one (unless you can convince a 1&1 tech to swap the old one in as your "freebie" - I was successful with that once).
-
I just spent an hour on the phone with Godaddy tech support and have solved nothing on my end. UGH...
-
I just spent an hour on the phone with Godaddy tech support and have solved nothing on my end. UGH...
Same here...
-
Chris,
To add to what Ken said, the following in .htaccess on your NEW (1&1) server will redirect http accesses to https:
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
This is what I had to do after editing all external links and images in the source files from http:// to https://.
Example: change "http://icons.wunderground.com/data/640x480/ne_rd_anim.gif" to "https://icons.wunderground.com/data/640x480/ne_rd_anim.gif" on your front page.
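That kind of edit can be scripted across a template's contents. A minimal sketch (in Python for illustration; the templates themselves are PHP, and SECURE_HOSTS is a hypothetical list of hosts you've confirmed serve content over SSL):

```python
# Hypothetical example: upgrade known-good hosts from http:// to https://
# in a page's contents. Only rewrite hosts you have verified support SSL --
# a blind replace can break links to http-only servers.
SECURE_HOSTS = ["icons.wunderground.com"]

def upgrade_links(html: str) -> str:
    # Replace the scheme only for hosts in the verified list.
    for host in SECURE_HOSTS:
        html = html.replace("http://" + host, "https://" + host)
    return html

sample = '<img src="http://icons.wunderground.com/data/640x480/ne_rd_anim.gif">'
print(upgrade_links(sample))
```

Once every fetched resource on the page uses https, the mixed-content warning (and missing padlock) should clear up.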
This is in my .htaccess file edited to fit your needs.
Lines 2 and 3 are to allow "http://" access to the gen-MBtags.php and conds.php files so that Meteobridge can handle them properly. This snippet should work out of the box if your site is in the root. It appears to be working correctly and I now have a padlock symbol.
Make sure you don't have "RewriteEngine On" in the .htaccess file more than once.
@Jasiu does this look correct to you?
RewriteEngine On
RewriteCond %{THE_REQUEST} !/gen-MBtags\.php
RewriteCond %{THE_REQUEST} !/conds\.php
RewriteCond %{SERVER_PORT} 80
RewriteCond %{HTTP_HOST} ^(www\.)?gateway2capecod\.net
RewriteRule ^(.*)$ https://www.gateway2capecod.net/$1 [R,L]
-
Just a few comments:
1) I modified the config on my meteobridge so that specific rules were not necessary in .htaccess. My conds.php URL has an explicit "https://" in it. I use an "SFTP" rather than "FTP" entry for the tags file and the template URL has an explicit "https://" also.
2) I use the 301 return code (R=301) rather than the default 302 to indicate that the redirect is permanent.
-
Hello Again...
Well, so far I am pleased with 1&1 as my new host. Everything is working like it should, and I even re-added my USA Extremes script to the left sidebar menu that was not working correctly on the GoDaddy server. I probably should have just used my old domain with the new web host, but after what GoDaddy put us through without fixing anything for us loyal customers, the heck with them; they won't be getting another penny from me once the domain and web hosting I have from them expire in a few months.
I updated all the places/forums etc. with my new domain URL, and also have the note on my old GoDaddy ".com" site for all visitors to update their bookmarks, so hopefully everyone finds me at the .net site now. I also updated the links for my domain on things like my PWS page, AWEKAS page, and CWOP page so visitors to those places will get to the new domain.
If anyone else is sick of the GoDaddy mess like I was and wants to switch, just go here: https://www.ionos.com/hosting/web-hosting and choose the Essential package for $4.00/month. Not a bad deal, and if all you have is one domain/website, the Essential package is all you need.
I still miss the good ol days when Alan from E-RICE Webhost was only Charging us $25.00 a year for Our sites to be online and everything ran Smoothly.
Have a Good Evening....Chris
-
It seems as if the METAR issue has been solved, at least for me anyway. It's been a week and they haven't messed up yet.
-
Lo and behold, my new web hosting is now (as of today) getting "unable to load KMQS data RC=403 Forbidden".
www.atglenweather.com it's been working great since I switched from GoDaddy to Hostgator (about a month).
-
Lo and behold, my new web hosting is now (as of today) getting "unable to load KMQS data RC=403 Forbidden".
www.atglenweather.com it's been working great since I switched from GoDaddy to Hostgator (about a month).
I am getting the same thing and I just moved from Go Daddy to 1 and 1.
Sent from my SM-G960U using Tapatalk
-
Lo and behold, my new web hosting is now (as of today) getting "unable to load KMQS data RC=403 Forbidden".
www.atglenweather.com it's been working great since I switched from GoDaddy to Hostgator (about a month).
I'm seeing in a view-source with ?debug=y <!-- get-metar-conditions-inc.php - Version 1.17 - 30-Nov-2018 -->
<!-- mtr_conditions using METAR ICAO='KMQS' -->
<!-- curl fetching 'https://tgftp.bldr.ncep.noaa.gov/data/observations/metar/stations/KMQS.TXT' -->
<!-- HTTP stats: RC=403 dest=140.172.138.79
Times: dns=0.000 conn=0.069 pxfer=0.319 get=0.060 total=0.379 secs -->
<!-- headers returned:
HTTP/1.1 403 Forbidden
Date: Wed, 06 Nov 2019 20:18:23 GMT
Server: Apache
X-Frame-Options: SAMEORIGIN
X-Content-Type-Options: nosniff
X-XSS-Protection: 1; mode=block
Vary: Accept-Encoding
Content-Length: 243
Content-Type: text/html; charset=iso-8859-1
Strict-Transport-Security: max-age=31536000; includeSubdomains; preload
-->
<!-- mtr_conditions returns RC='403 Forbidden' for ICAO/METAR='KMQS' -->
on your page. It may be that the NWS site has (temporarily) declared your hostgator website IP address as 'hostile' (or the hostgator egress server) and is blocking the requests. It's something for hostgator tech support and the NWS to resolve, I fear.
It's working fine on my site, and also on CamarilloWX's site at this time.
-
Thanks Ken and I'll approach Hostgator tomorrow and see what happens.
-
Odd how this seems to be a recurring issue across web hosts. Is there an alternative source for the METAR data?
-
Couple of options...
1) https://aviationweather.gov/metar/data
This would involve an HTML scrape - pulling everything within the <code> tag.
Example:
https://aviationweather.gov/metar/data?ids=KBOS&format=raw&hours=0&taf=off&layout=off (https://aviationweather.gov/metar/data?ids=KBOS&format=raw&hours=0&taf=off&layout=off)
returns (look for "data starts here"):
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
<html>
<head>
<base href="aviationweather.gov" /><title>AWC - METeorological Aerodrome Reports (METARs)</title>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8" >
<meta http-equiv="content-type" content="text/html; charset=utf-8" >
<meta http-equiv="Refresh" content="900" >
<meta name="DC.language" content="en-us" scheme="DCTERMS.RFC1766" >
<meta name="description" content="Aviation Weather Center Homepage provides comprehensive user-friendly aviation weather Text products and graphics." >
<meta name="keywords" content="aviation, weather, icing, turbulence, convection, pirep, metar, taf, airmet, sigmet, satellite, radar, surface, wind, temperature, aloft, airplane, NEXRAD, GOES, WSR-88D, precipitation, rain, snow, sleet, thunderstorm, en-route, prognosis, chart" >
<meta name="DC.title" content="AWC - Aviation Weather Center" >
<meta name="DC.description" content="Aviation Weather Center Home Page ... METARs Page" >
<meta name="DC.creator" content="NOAA's National Weather Service - Aviation Weather Center Homepage" >
<meta name="DC.date.created" content="2013-03-15" scheme="ISO8601" >
<meta name="DC.date.reviewed" content="2013-03-15" scheme="ISO8601" >
<link href="//code.jquery.com/ui/1.10.4/themes/smoothness/jquery-ui.css" media="screen" rel="stylesheet" type="text/css" >
<link href="/images/favicon.ico" rel="shortcut icon" >
<link href="/images/noaa_logo.png" rel="apple-touch-icon" >
<link href="/css/layout.css" media="" rel="stylesheet" type="text/css" >
<script type="text/javascript" src="//code.jquery.com/jquery-1.9.1.js"></script>
<script type="text/javascript" src="//code.jquery.com/ui/1.10.3/jquery-ui.js"></script>
<script type="text/javascript" src="/lib/OpenLayers-2.13.1/OpenLayers.js"></script>
<script type="text/javascript" src="/javascript/awc_openlayers.js"></script>
<script type="text/javascript" async="true" id="_fed_an_ua_tag" src="https://dap.digitalgov.gov/Universal-Federated-Analytics-Min.js?agency=DOC&subagency=AWC&pua=ua-33523145-2"></script>
</head>
<body >
<div id="awc_pagewrapper">
<!-- Main content -->
<div id="awc_main">
<div id="awc_main_content">
<script type="text/javascript">
function goto(format,hours, taf, layout){
var id_elem = document.getElementById("ids");
var url = "/metar/data?ids="+id_elem.value+"&format="+format+"&hours="+hours+"&taf="+taf+"&layout="+layout;
window.location = url;
}
</script>
<div id="title">METAR Data<div id="title_linkbar">
<div class='title_link'><a href='/metar?date=' title = 'METAR Home'>METAR Home</a></div>
<div class='title_link'><a href='/metar/plot?date=' title = 'METAR Plots'>Plot</a></div>
<div class='title_link on_page'>Data</div>
<div class='title_link'><a href='/metar/board?date=' title = 'METAR Board'>Board</a></div>
<div class='title_link'><a href='/metar/help?page=text&date=' title = 'METAR Information'>Info</a></div>
</div>
</div>
<div id="awc_main_content_wrap">
<p clear="both">
<strong>Data at: 0123 UTC 07 Nov 2019</strong></p>
<!-- Data starts here -->
<code>KBOS 070054Z 30006KT 10SM CLR 09/M07 A3035 RMK AO2 SLP276 T00891072</code><br/>
<hr width='65%' />
<!-- Data ends here -->
</p>
</div>
</div> <!-- awc_main_content -->
</div> <!-- awc_main -->
</div> <!-- awc_pagewrapper -->
</body>
</html>
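Given the "Data starts here" / "Data ends here" markers in that sample, the scrape itself is simple. A minimal sketch (Python for illustration; the actual template scripts are PHP):

```python
import re

def extract_metars(html: str) -> list:
    """Pull raw METAR strings out of the aviationweather.gov data page.

    The observations sit inside <code>...</code> tags between the
    'Data starts here' and 'Data ends here' comments, as in the
    sample response above.
    """
    return re.findall(r"<code>(.*?)</code>", html, re.DOTALL)

# Trimmed-down version of the sample page quoted above.
sample = (
    "<!-- Data starts here -->\n"
    "<code>KBOS 070054Z 30006KT 10SM CLR 09/M07 A3035 "
    "RMK AO2 SLP276 T00891072</code><br/>\n"
    "<!-- Data ends here -->"
)
print(extract_metars(sample))
```

As with any HTML scrape, this is brittle: if the site changes its markup, the regex breaks, which is one argument for the API option below.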
2) Probably a cleaner way to get the same info is to use the NWS API "latest observation" endpoint. But this would essentially be a rewrite.
Example:
https://api.weather.gov/stations/KBOS/observations/latest (https://api.weather.gov/stations/KBOS/observations/latest)
This returns JSON and therefore doesn't require a METAR decoder.
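A minimal sketch of reading that endpoint's output (Python for illustration; the sample JSON below is trimmed to a few fields, and the exact schema should be verified against the NWS API documentation):

```python
import json

# Trimmed-down sample of the JSON returned by
# https://api.weather.gov/stations/KBOS/observations/latest
# (field names per the 2019-era response; verify against the API docs)
sample = json.loads("""
{
  "properties": {
    "timestamp": "2019-11-07T20:29:00+00:00",
    "rawMessage": "KBOS 070054Z 30006KT 10SM CLR 09/M07 A3035 RMK AO2",
    "temperature": {"unitCode": "unit:degC", "value": 8.9},
    "windSpeed": {"unitCode": "unit:km_h-1", "value": 11.16}
  }
}
""")

props = sample["properties"]
temp_c = props["temperature"]["value"]  # already decoded -- no METAR parser needed
print(props["timestamp"], temp_c)
```

The attraction is that each value arrives as a typed JSON field with a unit code, so the script consuming it never has to decode the METAR string itself.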
-
Here's what I got this morning from tech support at Hostgator:
I have checked the server settings and website logs. I could not find any issues on our end. Since NWS is blocking the IP address, I suggest that you contact them to remove the block. Unfortunately, we do not have access to the blacklist on NWS.
So it looks like I'm stuck :-(
-
Jasiu - what would a rewrite involve, and what would the cost of a rewrite be? According to the template web list on Ken's page, I'm not the only one having this issue.
Thanks
P.S. On a side note, I noticed that some METAR pages (Ken's site for example) say "Night time" and then the conditions. Mine never did show the time of day, e.g. night time, dawn.
-
Jasiu - what would a rewrite involve, and what would the cost of a rewrite be? According to the template web list on Ken's page, I'm not the only one having this issue.
It's essentially a replacement of get-metar-conditions-inc.php. The code would be a bit simpler because the values would be taken straight from the JSON returned rather than having to decode a METAR string.
Let's let Ken chime in as this is his code. I could be recruited to do the initial code (I don't want to be on the hook for maintenance, although I'm not going anywhere soon - I hope!).
P.S. On a side note, I noticed that some METAR pages (Ken's site for example) say "Night time" and then the conditions. Mine never did show the time of day, e.g. night time, dawn.
Can't say I ever saw that. The icon choice depends on the time of day, so that e.g. a clear sky at night gives you a dark image with a moon instead of a blue sky and sun. It may have something to do with that.
-
Thank You for the feedback and it would be greatly appreciated :-) I'd be happy to help with time and knowledge compensation.
-
Jasiu - what would a rewrite involve, and what would the cost of a rewrite be? According to the template web list on Ken's page, I'm not the only one having this issue.
It's essentially a replacement of get-metar-conditions-inc.php. The code would be a bit simpler because the values would be taken straight from the JSON returned rather than having to decode a METAR string.
Let's let Ken chime in as this is his code. I could be recruited to do the initial code (I don't want to be on the hook for maintenance, although I'm not going anywhere soon - I hope!).
P.S. On a side note, I noticed that some METAR pages (Ken's site for example) say "Night time" and then the conditions. Mine never did show the time of day, e.g. night time, dawn.
Can't say I ever saw that. The icon choice depends on the time of day, so that e.g. a clear sky at night gives you a dark image with a moon instead of a blue sky and sun. It may have something to do with that.
The tgftp.nws.noaa.gov server has been accumulating worldwide METAR reports for years now and was the only publicly available source for that raw data (AFAIK). Several aviation-based sites now do the same, and ogimet.net does also.
I just checked out the api.weather.gov/stations/{ICAO}/observations/latest feeds for several US and international sites, and it appears that it also has international data.
I'll work up a get-metar-conditions-inc.php script to use it as an option. Thanks for the suggestion.
-
I'll work up a get-metar-conditions-inc.php script to use it as an option. Thanks for the suggestion.
Ken being awesome, as usual! [tup]
-
We would expect no less from Ken... He's always willing to lend me a hand when I'm in need!
-
The tgftp.nws.noaa.gov server has been accumulating worldwide METAR reports for years now and was the only publicly available source for that raw data (AFAIK). Several aviation-based sites now do the same, and ogimet.net does also.
I just checked out the api.weather.gov/stations/{ICAO}/observations/latest feeds for several US and international sites, and it appears that it also has international data.
I'll work up a get-metar-conditions-inc.php script to use it as an option. Thanks for the suggestion.
Hello Ken,
The api.weather.gov/stations/{ICAO}/observations/latest endpoint often returns older data compared to the direct feeds:
https://tgftp.nws.noaa.gov/data/observations/metar/stations/KBOS.TXT
https://tgftp.bldr.ncep.noaa.gov/data/observations/metar/stations/KBOS.TXT
2019/11/07 17:54
KBOS 071754Z 22012KT 10SM FEW055 BKN110 OVC140 14/04 A3004 RMK AO2 SLP173 T01440039 10150 20083 58035
https://api.weather.gov/stations/KBOS/observations/latest
"rawMessage": "KBOS 071654Z 21014KT 10SM BKN110 OVC140 14/04 A3008 RMK AO2 SLP186 T01390039",
Sometimes they are almost the same, sometimes 30 minutes or more apart.
===
https://api.weather.gov/stations/EBBR/observations/latest
"rawMessage": "EBBR 071720Z 21010KT CAVOK 07/04 Q0999 NOSIG",
The other two https://tgftp.bldr.ncep.noaa.gov/data/observations/metar/stations/EBBR.TXT
2019/11/07 17:50
EBBR 071750Z 21009KT CAVOK 07/04 Q0999 NOSIG
Wim
-
Thanks for the heads-up about timeliness of the reports on api.weather.gov.
I've done a quick mod to get-metar-conditions-inc.php V1.18 to optionally use api.weather.gov instead of tgftp.nws.noaa.gov.
The 'hack' simply gets the JSON and uses only the rawMessage and timestamp, letting the rest of the script decode the METAR as normal. I didn't have time to use the decoded conditions in the JSON with this code; this retains compatibility with the other scripts that may use this too.
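In outline, the approach looks something like this (a Python sketch of the idea, not Ken's actual PHP; the function name is made up for illustration):

```python
import json

def metar_from_api(body: str):
    """Pull only rawMessage and timestamp from an api.weather.gov
    'latest observation' JSON body, so an existing METAR decoder
    can process the report unchanged."""
    props = json.loads(body).get("properties", {})
    raw = props.get("rawMessage")
    if not raw:  # null or empty -> nothing to decode
        return None
    return props.get("timestamp"), raw

# Sample body modeled on the KLNS response quoted later in the thread.
body = ('{"properties": {"timestamp": "2019-11-07T16:11:00+00:00", '
        '"rawMessage": "KLNS 071611Z 28023G37KT 10SM -RA SCT018 OVC035 06/02 A2906"}}')
print(metar_from_api(body))
```

The null check matters: as discussed further down, some stations return `"rawMessage": null`, and in that case there is simply nothing to hand to the decoder.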
To test it out:
1) replace get-metar-conditions-inc.php with the attached version 1.18.
2) in Settings.php, add $SITE['useMetarAPI'] = true;
to enable use of api.weather.gov.
Let me know (Ed) how it works for you.
-
P.S. On a side note I noticed that some of the metar (Ken's site for example) say "Night time" then the conditions. Mine never did say the time of day... ex - night time, dawn
Ed, the conditions icon/text string come from Weather-Display webtags .. it's based on a combination of Solar sensor(day) and nearby Metar (night) with overrides for wind/rain. Since you're not using Weather-Display, your template conditions come from the nearby METAR you selected, with overrides for local rain events.
-
Makes sense Ken on the weather display... I'll load the new file here shortly! Thank you!
-
New Script seems to be working! Thanks Ken and I'll monitor it closely over the next couple of day!
-
Any special place in settings.php to insert the $SITE['userMetarAPI'] = true; ?
Thanks
-
No special place for it in Settings.php .. anywhere within the configurable settings area is fine.
And it’s $SITE['useMetarAPI'] = true;
-
Got it... Thanks Ken
-
Oops something's off -
-
Ed, what you are seeing is what Wim described above. I also get old data for some locations.
But script-wise, it's working in both modes (true/false). =D> for Ken's quick work!
Ken, you think one of us should open a ticket w/ NWS on the stale data issue?
-
Jasiu - it is showing data (sometimes out-of-date data, but data without errors). However, I am missing data for my main METAR (KMQS - Coatesville, Pennsylvania, USA; distance from station: ENE 6 mi, 10 km). I'm not sure what's going on with that...
-
That's what Wim was talking about: the data on api.weather.gov is not as 'fresh' as the data on the tgftp.nws.noaa.gov site. Since you're blocked from using the primary site, some 'stale' data is likely better than no data, I think.
-
I just looked at https://api.weather.gov/stations/KMQS/observations/latest and the issue is no rawMessage data:
"timestamp": "2019-11-07T20:29:00+00:00",
"rawMessage": null,
The rawMessage part of the JSON is supposed to have the raw METAR observation. Since it's empty, there's nothing to decode :(
-
Thanks Ken!
-
the rawMessage part of the JSON is supposed to have the raw METAR observation. Since it's empty, there's nothing to decode
Sounds like another issue to report.
-
I've sent a message to the problem reporting address regarding the timeliness and rawMessage content missing issues.
The timeliness issue seems to be long running. On the FAQ part of the API Docs (https://www.weather.gov/documentation/services-web-api) it says
Delayed observations
An infrastructure issue is causing delayed processing of observation station data. Observations may be intermittently delayed or not available. This is being worked, and will likely take several weeks to resolve.
Updated 09/05/2018
Maybe 'several weeks' in NWS time means more than 52 weeks. :???: #-o
-
As I stated a few weeks back, the Western Region of the NWS has been having problems with weather observation data being updated on time, and I have been conversing with NWS regional headquarters in SLC. I specifically sent this thread to them today to see if the problems are somehow interrelated. Whether they take the time to research it, I have no idea, but they have been over-the-top helpful so far. I'll post the response if I get one.
-
Thanks CW2274!
It looks like Ed has data displaying again.
-
I do have data flowing again, Ken, and I'm not sure what happened. It started working before we made the switch to the new get-metar script, but I needed an alternative, because we both know this is probably a temporary feel-good thing from the NWS and I'll need to switch over to the new script sooner rather than later!
As always I truly appreciate your help... Thank you!
-
I'll post the response if I get one.
Guess this is better than a sharp stick in the eye, but here's the response from SLC...
Thanks for the link. This issue is not related to us. It looks like this thread is more geared toward a hosting issue, but it seems that the users are on the right track. The https://weather.gov/ API is a good place to get weather data for privately owned websites. Another good source is Synoptic; users will need to register to get a "key".
Cheers,
-
Seems like the issue is popping up again: www.gosportwx.com is getting 403 Forbidden. It just started happening 2 days ago; it was working fine until then.
-
Yep, I got it too :-( I switched over to Ken's updated get-metar-conditions script and things seem to be back on track.
-
Anyone have any ideas why I don't get a METAR report from KMQS? I do get data from these two sources Jasiu suggested...
https://api.weather.gov/stations/KMQS/observations/latest
and
https://aviationweather.gov/metar/data?ids=KMQS&format=raw&hours=0&taf=off&layout=off
-
Anyone have any ideas why I don't get a METAR report from KMQS? I do get data from these two sources Jasiu suggested...
https://api.weather.gov/stations/KMQS/observations/latest
If you look at the JSON returned:
"rawMessage": null,
This is where Ken's new code pulls the data (rather than extracting from the individual fields, which would require a rewrite).
Here's what it looks like when the field is filled in properly:
"rawMessage": "KBED 101151Z 22008KT 10SM OVC020 13/10 A2971 RMK AO2 SLP074 60000 70056 T01280100 10133 20122 53014",
-
Ahhhh gotcha thanks!
-
Anyone else having Metar issues? Mine hasn't updated since Feb 7th... https://www.atglenweather.com/wxmetar.php
-
Anyone else having Metar issues? Mine hasn't updated since Feb 7th... https://www.atglenweather.com/wxmetar.php
Looks more like individual airports not reporting than the script not working. When I just visited, Smoketown and Reading were not current, but the other 5 were within the last hour.
Brad
-
Anyone else having Metar issues? Mine hasn't updated since Feb 7th... https://www.atglenweather.com/wxmetar.php
You're using api.weather.gov for the metar source <!-- get-metar-conditions-inc.php - Version 1.18 - 07-Nov-2019 -->
<!-- mtr_conditions using METAR ICAO='KLNS' -->
<!-- curl fetching 'https://api.weather.gov/stations/KLNS/observations/latest' -->
<!-- HTTP stats: RC=200 dest=23.73.146.250
Times: dns=0.000 conn=0.028 pxfer=0.092 get=0.040 total=0.133 secs -->
<!-- loaded from URL https://api.weather.gov/stations/KLNS/observations/latest -->
<!-- KLNS='2020/02/07 16:11 KLNS 071611Z 28023G37KT 10SM -RA SCT018 OVC035 06/02 A2906 RMK AO2 PK WND 29039/1600 P0001 T00560022' -->
<!-- age=192881 sec '2020-02-07 16:11:00 GMT' -->
and it looks like a NWS issue of not updating the observations data there.
You can switch back to using tgftp.nws.noaa.gov instead of api.weather.gov by changing $SITE['useMetarAPI'] = true;
to $SITE['useMetarAPI'] = false;
in Settings.php.
-
Thank you Ken!!!