Changing the IP address range has an influence on the number of robots?
Hohl, Gerrit
g.hohl at aurenz.de
Tue Dec 6 15:41:36 UTC 2011
Hello everyone,
now I have an interesting phenomenon: Changing the IP address range
seems to have an influence on the number of robots.

I cloned my test image 4 times - for 2 clients and 2 servers. I modified
the systems so that none of them conflicts with any other system. First
I wanted to configure the test for 1 client and 1 server. I configured
the aliases and routes and everything worked fine (including ping and
so on). Then I modified my test by changing the IP address range in
addr_space and added the hosts property. I paid attention to the mapping
described in the polymix-4 configuration. I uncommented everything
related to DNS so I wouldn't have to care about that.

Then I started server and client (without proxy). The server showed me
that 750 virtual servers had been started - which is okay and exactly
the same as before. But when I started the client, it showed me that it
started 2500 robots. Previously it started only 250 - 2 on each of the
125 aliases. I uncommented the hosts property, but that didn't change
anything.
Why does changing the IP address range have an effect on the number of
robots? And how?
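For reference, one way to sanity-check the numbers (this derivation is my assumption about how Polygraph sizes the robot population, not something confirmed here): if the total robot count is derived from the bench load parameters rather than from the configured alias count, the observed 2500 falls directly out of the configuration below.

```python
# Hypothetical sanity check (assumption: Polygraph sizes the robot
# population as peak_req_rate / max_agent_load).
peak_req_rate = 10000   # requests/sec, from the Bench below
max_agent_load = 4      # requests/sec per robot
agents_per_addr = 2     # from PolyMix4As

robots = peak_req_rate // max_agent_load       # total robots needed
robot_ips = robots // agents_per_addr          # IP aliases they occupy

print(robots)     # 2500 -- matches the reported robot count
print(robot_ips)  # 1250 aliases would then be needed
```

Under that assumption, the old 250-robot run would have been limited by a smaller address space rather than by the bench parameters.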
/*
 * The focus of this test is response time and bandwidth under high load.
 * Server and robot don't have a think time or anything like that. They
 * just put as much stress as possible on the proxy.
 */
#include "contents.pg"
#include "cred.pg"
Bench theBench = {
    // The peak request rate is 10,000 requests per second.
    peak_req_rate = 10000/sec;

    client_side = {
        /*
         * The servers should be able to handle 20,000 requests
         * per second.
         */
        max_host_load = 20000/sec;

        // Every robot should create 4 requests per second.
        max_agent_load = 4/sec;

        addr_space = [ 'lo::10.5.0-123.1-250/22' ];
        // hosts = [ '172.16.5.1' ];
    };

    server_side = {
        max_host_load = client_side.max_host_load;
        max_agent_load = undef();

        addr_space = [ 'lo::10.5.128-251.1-250:80/22' ];
        // hosts = [ '172.16.5.128' ];
    };
};
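As a side check on the ranges above (simple octet-range arithmetic; I am assuming the `a-b.c-d` notation expands to the full cross product and that the `/22` netmask does not limit the count):

```python
# Count the addresses covered by an octet range spec like '0-123'.
def range_size(spec: str) -> int:
    lo, hi = (int(x) for x in spec.split('-'))
    return hi - lo + 1

# client side: 10.5.0-123.1-250  ->  124 * 250 addresses
client_addrs = range_size('0-123') * range_size('1-250')
# server side: 10.5.128-251.1-250  ->  also 124 * 250 addresses
server_addrs = range_size('128-251') * range_size('1-250')

print(client_addrs, server_addrs)  # 31000 31000
```

So the configured space is far larger than either 250 or 2500 robots would need, which is consistent with the robot count being driven by something other than the raw size of addr_space.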
PolyMix4As asPolyMix4 = {
    // We will have 2 robots on each IP address / alias.
    agents_per_addr = 2;
};
DnsResolver resolver = {
    servers = [ '127.0.0.1:53' ];
    timeout = 5sec;
};
Server server = {
    kind = "simple-srv";
    contents = [ cntImage: 65%, cntHTML: 15%, cntDownload: 0.5%, cntOther ];
    direct_access = [ cntHTML, cntDownload, cntOther ];
    addresses = serverAddrs(asPolyMix4, theBench);
};
AddrMap addrMap = {
    zone = "bench.tst";
    addresses = server.addresses;
    names = ipsToNames(addresses, zone);
};
// a primitive robot
Robot robot = {
    kind = "simple-rbt";
    pop_model = {
        pop_distr = popUnif();
    };
    req_types = [ "Basic", "Ims200": 5%, "Ims304": 10%, "Reload": 5% ];
    req_methods = [ "GET", "POST": 1.5%, "HEAD": 0.1% ];
    dns_resolver = resolver;
    origins = addrMap.names;
    // origins = server.addresses;
    addresses = robotAddrs(asPolyMix4, theBench);
    post_contents = [ cntSimpleRequest ];
    credentials = cred;
};
Phase inc = {
    name = "inc";
    goal.duration = 5min;
    populus_factor_beg = 0%;
    populus_factor_end = 100%;
    load_factor_beg = 0%;
    load_factor_end = 100%;
};

Phase top = {
    name = "top";
    goal.duration = 10min;
};

Phase dec = {
    name = "dec";
    goal.duration = 5min;
    populus_factor_beg = 100%;
    populus_factor_end = 0%;
    load_factor_beg = 100%;
    load_factor_end = 0%;
};
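The three phases give a linear 5-minute ramp-up, a 10-minute plateau, and a 5-minute ramp-down. A minimal sketch of the resulting load factor over time (plain linear interpolation, assuming the factors ramp linearly within each phase):

```python
def load_factor(t_min: float) -> float:
    """Load factor (0..1) at minute t for the inc/top/dec schedule."""
    if t_min < 5:          # "inc": 0% -> 100% over 5 minutes
        return t_min / 5
    if t_min < 15:         # "top": steady at 100%
        return 1.0
    if t_min < 20:         # "dec": 100% -> 0% over 5 minutes
        return (20 - t_min) / 5
    return 0.0             # after the schedule ends

print(load_factor(2.5), load_factor(10), load_factor(17.5))  # 0.5 1.0 0.5
```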
schedule(inc, top, dec);
// commit to using these servers and robots
use(server, robot);
use(addrMap);
use(theBench);