Found some more silly business going on: robot registration attempts from China, Russia, and Ukraine. They're blocked now with 403s and shouldn't even launch a PHP process.
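For anyone curious, here's a minimal sketch of the kind of .htaccess rules that return a 403 before PHP ever runs. The CIDR ranges and user-agent names below are placeholders for illustration, not the actual block list (this assumes Apache 2.4's Require syntax):

```apache
# Refuse known-bad clients before any PHP process is launched.
# These IP ranges and bot names are placeholders, not the real block list.
<RequireAll>
    Require all granted
    Require not ip 203.0.113.0/24
    Require not ip 198.51.100.0/24
</RequireAll>

# Block a rogue crawler by its User-Agent string (403 via the [F] flag).
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (BadBot|EvilCrawler) [NC]
RewriteRule .* - [F]
```

Because the 403 is handled by Apache itself, the request never reaches the PHP process pool.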
Our hosting allows 15 concurrent PHP processes; the 16th and above get a 500 error until the concurrent count drops back to 15 or below.
The legitimate bots (Google, Yahoo!, and Bing) are all well behaved and don't clog up the queue much. It's the miscreants, robo-registration attempts and rogue webcrawlers, that misbehave and can use up our PHP processes.
The server hosting WXForum.net also hosts northamericanweather.net (and the automation behind the Global Google map data collection), but that only uses a couple of PHP processes every 5 minutes or so.
I did implement custom error pages for 400, 403, 404, and 500 errors, as I too got tired of seeing the defaults from Apache (which was also logging a '404' while looking for a custom error page to offer).
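Custom error pages in .htaccess boil down to ErrorDocument directives; a sketch follows, where the /errors/ paths are placeholders rather than the actual file layout:

```apache
# Serve custom static pages instead of Apache's defaults.
# The /errors/ paths are placeholders; point them at files that really
# exist, or Apache will log a 404 while hunting for the error page itself.
ErrorDocument 400 /errors/400.html
ErrorDocument 403 /errors/403.html
ErrorDocument 404 /errors/404.html
ErrorDocument 500 /errors/500.html
```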
I'm watching and analyzing the Apache logs multiple times per day and adjusting the .htaccess block list accordingly.
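A quick way to spot candidates for the block list is to tally requests per client IP. A minimal sketch, assuming the common/combined log format (client IP in the first field); the here-doc sample stands in for a real access log:

```shell
# Tally requests per client IP (field 1 in common/combined log format).
# Replace the here-doc sample with your real log, e.g.:
#   awk '{ print $1 }' access_log | sort | uniq -c | sort -rn | head -20
awk '{ print $1 }' <<'EOF' | sort | uniq -c | sort -rn | head -20
203.0.113.9 - - [01/Jan/2024:00:00:00 +0000] "POST /index.php?action=register HTTP/1.1" 403 199
203.0.113.9 - - [01/Jan/2024:00:00:01 +0000] "POST /index.php?action=register HTTP/1.1" 403 199
198.51.100.7 - - [01/Jan/2024:00:00:02 +0000] "GET /index.php HTTP/1.1" 200 1024
EOF
```

IPs that pile up hundreds of registration POSTs stick out immediately and can be fed back into the block list.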
Thanks for your patience as we get this issue down to an 'only sometimes' problem soon...
Best regards,
Ken