Topic: MaxRequestsPerChild bug, please help
dke
Joined: 13 Jul 2007 Posts: 61 Location: Sweden
Posted: Sun 22 Jul '07 0:26 Post subject: MaxRequestsPerChild bug, please help
Hi all,
I have fine-tuned my Apache settings for my website and noticed a LARGE increase in performance with these settings under mpm_winnt:
Timeout 30
KeepAlive On
MaxKeepAliveRequests 100
KeepAliveTimeout 5
ThreadsPerChild 100
MaxRequestsPerChild 1000
With MaxRequestsPerChild at only 1000, the child process is supposed to die and respawn after handling 1000 requests. The thing is, it doesn't: it hangs completely, just sits in Task Manager taking up about 50 MB of RAM, and won't let any connections through. I have to go in and restart it manually, but once it's up again I get very good performance. If I set MaxRequestsPerChild to 0 it eventually gets slow and doesn't handle my requests fast enough; I guess there's some kind of memory leak in Apache.
Any ideas on this error? The error log shows nothing about it.
James Blond (Moderator)
Joined: 19 Jan 2006 Posts: 7371 Location: Germany, Next to Hamburg
Posted: Wed 25 Jul '07 15:48 Post subject:
What kind of pages does Apache serve? Pure HTML? PHP? Perl? Something else?
dke
Joined: 13 Jul 2007 Posts: 61 Location: Sweden
Posted: Sun 29 Jul '07 2:33 Post subject:
Hi there, I'm experimenting with a Coppermine gallery page that uses PHP and MySQL. I think the problem is related to the large number of thumbnails being generated for each page; is there any way to put the thumbnails in a cache for faster loading?
Also, I somehow resolved the issue with the Apache service not being able to restart the child process, but performance still peaks quickly once 10 users are browsing, even though I see no performance hit on the machine itself. Is there any way for me to let Apache consume more resources from the machine?
Thanks for the reply, James!
EDIT: I'd like to add that the parent server only seems to send the stop and restart signal to the child process 6 times. After that the child process seems static: I can only get up to "Parent Server Generation: 6", and after that it never receives the restart signal again. By now it is consuming 700 MB of RAM because it doesn't get restarted, and it seems I have to shut Apache down manually and start it again; then it successfully restarts the child process 6 times (6 x 1000 hits) before going static again. Is this a setting in the compilation code?
If I don't use MaxRequestsPerChild 1000, my server rapidly gets slower. That sounds like a memory leak to me, but I've tested 5-6 different builds of Apache and I get the same trouble with all of them.
I've also tested FastCGI without any difference; I get exactly the same results. It's something in Apache that isn't working properly.
James Blond (Moderator)
Joined: 19 Jan 2006 Posts: 7371 Location: Germany, Next to Hamburg
Posted: Sun 29 Jul '07 16:27 Post subject:
There must be a way to save the generated thumbs. Are the thumbs created by the PHP GD library? For heavy usage I found a solution in ImageMagick; it is a command-line tool. OK, that may mean some recoding of the PHP, but you could put the generated thumbs in a folder: if the thumb exists, just show it; otherwise generate it, save it, and then show it. Maybe you can even do that without installing ImageMagick, just by modifying the existing code.
Are the thumbs deleted after each page view?
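A minimal sketch of that exists-or-generate idea (my own illustration, not Coppermine's actual code; the function name and the callback are assumptions): reuse a thumbnail from disk when it is present and newer than the source image, and only run the expensive generator when it is missing or stale.

```php
<?php
// Sketch of disk-cached thumbnail generation. $generate is any
// callback that writes $thumbPath from $srcPath, e.g. a GD resize
// or a wrapper around ImageMagick's convert.
function cached_thumb($srcPath, $thumbPath, $generate)
{
    // regenerate only if the thumb is missing or older than the source
    $stale = !is_file($thumbPath)
          || filemtime($thumbPath) < filemtime($srcPath);
    if ($stale) {
        // the expensive step runs once, not on every page view
        call_user_func($generate, $srcPath, $thumbPath);
    }
    return $thumbPath;
}
```

The callback can be GD or ImageMagick; the caching decision stays the same either way.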
Of course you could also cache the generated HTML. I've written a really simple cache for PHP, but it does not watch for any POST or GET parameters; it simply caches the page:
Code:
<?php
function cache_start()
{
    global $cache_file_name, $age;
    // a superbly creative way of naming cache files
    $cache_file_name = __FILE__ . '_cache';
    // default cache age in seconds
    if (empty($age)) $age = 600;
    // check if a cache file exists and if the cached data is still valid
    if (@filemtime($cache_file_name) + $age > time()) {
        // cache hit: output the cached data and exit
        readfile($cache_file_name);
        unset($cache_file_name);
        exit;
    }
    // nothing in the cache, or the cache is too old: start buffering
    ob_start();
}

function cache_end()
{
    global $cache_file_name;
    // nothing to do (a cache hit already exited above)
    if (empty($cache_file_name)) return;
    // fetch the buffered output of the script
    $str = ob_get_clean();
    // send the output to the user first, so they don't have to wait
    // for the cache write to complete
    echo $str;
    // write to a temp file and close it before renaming; the close
    // matters on Windows, where renaming an open file fails
    $fp = fopen($cache_file_name . '_tmp', 'w');
    if ($fp) {
        fwrite($fp, $str);
        fclose($fp);
        // swap the finished file into place
        rename($cache_file_name . '_tmp', $cache_file_name);
    }
}

cache_start();
// register the cache writer as the exit handler,
// so the rest of the script needs no modification
register_shutdown_function('cache_end');
?>
On your FastCGI PHP solution: how did you set up the FastCGI?
dke
Joined: 13 Jul 2007 Posts: 61 Location: Sweden
Posted: Sun 29 Jul '07 18:06 Post subject:
Hi James,
I'm sorry for the confusion; saying "lots of thumbs being generated per page" was wrong of me. All thumbs are stored on the hard drive and are ONLY generated by GD2 when I add them (I have no issues with that process).
I configured FastCGI the following way (the only way I know how) in httpd.conf:
LoadModule fcgid_module modules/mod_fcgid.so
AddHandler fcgid-script .php
DefaultInitEnv PHPRC "c:/progra~1/php/"
DefaultInitEnv SystemRoot "C:/Windows"
DefaultInitEnv SystemDrive "C:"
DefaultInitEnv TEMP "C:/WINDOWS/TEMP"
DefaultInitEnv TMP "C:/WINDOWS/TEMP"
DefaultInitEnv windir "C:/WINDOWS"
MaxRequestsPerProcess 500   # recommended in a FastCGI FAQ
<Directory "C:/mywebpage">
FCGIWrapper "c:/progra~1/php/php-cgi.exe" .php
Options ExecCGI
AllowOverride None
Order allow,deny
Allow from all
</Directory>
I've swapped back and forth between FastCGI and php5apache2, and I've noticed a slight performance increase using the php5apache2 module instead of FastCGI, but the same problem occurs sooner or later.
I'd like to add that, performance-wise, I see no bottlenecks on the machine. I have lots of RAM left, so the machine shouldn't be swapping for thumbs (though I don't know how to check whether it is?), and I get no CPU peaks.
Thanks so much for your help, James.
EDIT: I could add that I found a slight performance increase using mod_expires. I'm not sure what issues this can lead to; maybe there are more caching modules I could make use of?
<IfModule mod_expires.c>
ExpiresActive On
ExpiresDefault "access plus 1 seconds"
ExpiresByType text/html "access plus 1 seconds"
ExpiresByType image/gif "access plus 120 minutes"
ExpiresByType image/jpeg "access plus 120 minutes"
ExpiresByType image/png "access plus 120 minutes"
ExpiresByType text/css "access plus 60 minutes"
ExpiresByType text/javascript "access plus 60 minutes"
ExpiresByType application/x-javascript "access plus 60 minutes"
ExpiresByType text/xml "access plus 60 minutes"
</IfModule>
I also found this interesting read on a site:
Requests vs. Client Connections
On any given connection, to load a page a client may request many URLs: the page itself, site CSS files, JavaScript files, image files, etc.
Multiple requests from one client in rapid succession can have the same effect on a server as "concurrent" connections [threaded MPMs and the KeepAlive directive taken into consideration]. If a particular website requires 10 requests per page, 10 concurrent clients will require MPM settings that are geared more towards 20-70 clients. This issue manifests itself most under a process-based MPM [prefork].
This might just be my issue. The site only contained that text, with no solution for the problem, and I guess it's pretty specific to what site you run, but I probably have around 20 requests per page or more. What is my optimal MPM setting for that?
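Working from those numbers, here is a rough mpm_winnt sizing sketch (assumed figures for my setup, not a verified recommendation): size ThreadsPerChild well above the raw client count so the burst of sub-requests during a page load doesn't queue.

```apache
# Hypothetical starting point: ~10 concurrent browsers at ~20 requests
# per page behave more like 30-70 clients at the peak of a page load.
ThreadsPerChild      250   # generous headroom above the raw client count
MaxKeepAliveRequests 100   # more than one page's worth of sub-requests
KeepAliveTimeout     5     # free idle threads quickly between pages
MaxRequestsPerChild  0     # disable recycling once the hang is resolved
```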