Performance Benchmarking a Web Server
Apache Benchmark Procedures
Tweaks include the KeepAlive, KeepAliveTimeout, and MaxKeepAliveRequests settings. Recommended settings, all of which can be set in the httpd.conf file, are:
Code:
ServerLimit 128
MaxClients 128
KeepAlive On
KeepAliveTimeout 2
MaxKeepAliveRequests 100
Testing a stock Apache configuration (MaxClients is 256, ServerLimit is 256, KeepAliveTimeout is 15) using ab configured to make 1000 requests at a concurrency of 5 looks as follows.
$ ab -n 1000 -c 5 http://10.4.85.106/index.html
Where:
-n 1000: ab will send a total of 1000 requests to server 10.4.85.106 for the benchmarking session
-c 5: the concurrency level, i.e. ab will keep 5 requests in flight at a time against server 10.4.85.106
For example, if you want to send 10 requests, type the following command:
$ ab -n 10 -c 2 http://www.test.com/
Please note that 1000 requests is a small number; for a meaningful test you need to send far more (i.e., as many hits as you want to test with). For example, the following command will send 50000 requests:
$ ab -k -n 50000 -c 2 http://10.4.85.106/index.html
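After a run, ab prints a summary report; the headline figures are "Requests per second" and "Time per request". As a sketch, the snippet below saves a sample of that summary (the numbers are illustrative, not real measurements) and filters out the headline metrics:

```shell
# Save a typical 'ab' summary (sample figures, not a real run)
# and pull out the lines most people care about.
cat > ab-sample.txt <<'EOF'
Concurrency Level:      5
Time taken for tests:   2.104 seconds
Complete requests:      1000
Failed requests:        0
Requests per second:    475.29 [#/sec] (mean)
Time per request:       10.520 [ms] (mean)
Time per request:       2.104 [ms] (mean, across all concurrent requests)
EOF
grep -E 'Requests per second|^Time per request|^Failed' ab-sample.txt
```

A non-zero "Failed requests" count usually matters more than a high requests-per-second figure, so it is worth checking first.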
How do I carry out Web server Static KeepAlive test?
Use the -k option, which enables the HTTP KeepAlive feature in the ab test tool. For example:
$ ab -k -n 1000 -c 5 http://10.4.85.106/index.html
How do I save the result as comma separated values?
Use the -e option, which writes a comma separated value (CSV) file containing, for each percentage (from 1% to 100%), the time (in milliseconds) it took to serve that percentage of the requests:
$ ab -k -n 50000 -c 2 -e apache2r1.csv http://10.4.85.106/index.html
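The resulting file is a two-column CSV of percentile and serving time. As a sketch with illustrative values (the filename and numbers below are made up, not real ab output), awk can pick out a single percentile, such as the 95th:

```shell
# Illustrative percentile data in the CSV shape ab's -e option writes:
# one "percentage,time-in-ms" row per percentile.
cat > percentiles.csv <<'EOF'
Percentage served,Time in ms
50,4.21
90,7.80
95,9.63
99,15.02
100,42.17
EOF
# Print the 95th-percentile latency.
awk -F, '$1 == 95 { print "95th percentile: " $2 " ms" }' percentiles.csv
```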
How do I import the result into Excel or gnuplot so that I can create graphs?
Use the above command, or the -g option, which writes a gnuplot-format file, as follows:
$ ab -k -n 50000 -c 2 -g apache2r3.txt http://10.4.85.106/index.html
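The -g file is tab-separated; as I recall its columns are starttime, seconds, ctime, dtime, ttime (total time in ms), and wait. As a sketch (the rows below are made-up values, with starttime simplified to a number), awk can compute the mean total time from the ttime column:

```shell
# Write a small file in the tab-separated shape of ab's -g output
# (illustrative rows; real files have a date string in column 1).
printf 'starttime\tseconds\tctime\tdtime\tttime\twait\n' > apache2r3.txt
printf '0\t1\t0\t4\t5\t3\n0\t1\t1\t6\t7\t4\n0\t1\t0\t5\t6\t3\n' >> apache2r3.txt
# Mean of column 5 (ttime), skipping the header row.
awk -F'\t' 'NR > 1 { sum += $5; n++ } END { printf "mean ttime: %.1f ms\n", sum / n }' apache2r3.txt
```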
Sample psql.php (php+mysql) file
<?php
// Note: the legacy mysql_* API shown here was removed in PHP 7;
// on modern PHP, use mysqli or PDO instead.
$link = mysql_connect("localhost", "USERNAME", "PASSWORD");
mysql_select_db("DATABASE");
$query = "SELECT * FROM TABLENAME";
$result = mysql_query($query);
while ($line = mysql_fetch_array($result))
{
    foreach ($line as $value)
    {
        print "$value\n";
    }
}
mysql_close($link);
?>
Run ab command as follows:
$ ab -n 1000 -c 5 http://10.4.85.106/psql.php
Script for Load Testing Apache
Here is a bash script that uses the wget command to bombard your web server:
#!/bin/bash
for((i=0;i<=100;i++))
do
wget http://192.168.1.7/index.html
done
rm -f index.html*
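The loop above fetches requests one at a time. A parallel variant can be sketched with xargs -P; the URL below is the same placeholder as in the script, and the command is swapped for echo so the sketch can run without a live server (replace it with the wget line for a real test):

```shell
#!/bin/bash
# Parallel variant of the wget loop: 100 requests, 10 in flight at once.
URL="http://192.168.1.7/index.html"
CMD="echo"                      # dry run: just prints each URL
# CMD="wget -q -O /dev/null"    # real fetch; discards the response body
seq 1 100 | xargs -P 10 -I{} $CMD "$URL"
```

Keeping multiple requests in flight stresses the server's concurrency limits (MaxClients, somaxconn) in a way the serial loop never will.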
Apache Bench Max Concurrency
echo "10240" > /proc/sys/net/core/somaxconn
or, to make the change permanent, set
net.core.somaxconn = 10240 (or any number above what you are using as max connections)
in
/etc/sysctl.conf
then run
/sbin/sysctl -p /etc/sysctl.conf
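To confirm the new value took effect (Linux only), you can read it back either through sysctl or directly from procfs; both should report the same number:

```shell
# Read back the listen-backlog limit after running sysctl -p.
sysctl -n net.core.somaxconn
cat /proc/sys/net/core/somaxconn   # same value via procfs
```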
Also raise your local file-descriptor limit:
ulimit -n 65535
This leaves some headroom in case you want to increase the concurrent connections in ab.
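Before raising the limit, it is worth checking where you currently stand; without root, the soft limit can only be raised up to the hard ceiling:

```shell
# Current open-file limits for this shell.
ulimit -Sn   # soft limit (what 'ulimit -n 65535' changes)
ulimit -Hn   # hard ceiling the soft limit can be raised to
```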
Also, if you move to the Worker MPM (you will have to rebuild PHP to be thread-safe if you use PHP), it may lower your CPU usage at the cost of higher RAM usage, depending on what you are running on your server.