Sending multiple requests in single SSL connection

unjc email unjc.email at gmail.com
Thu Feb 7 20:14:52 UTC 2013


Hi Alex,

I am following what you have advised and am working on a plain HTTP load
first.  I am able to get persistent connections working - when I
examine the tcpdump, I see multiple HTTP requests being sent
by a robot before the connection is closed.  However, the HTTP
requests found in a single TCP stream are addressed to different
hosts (google.com, yahoo.com...).  What is the trick to make robots
send multiple requests to the same host (e.g. google.com) per
persistent connection, so that the persistent connections are
domain specific?


Thanks,
Jacky

On Tue, Feb 5, 2013 at 8:06 PM, Alex Rousskov
<rousskov at measurement-factory.com> wrote:
> On 02/05/2013 03:55 PM, unjc email wrote:
>
>> As mentioned, I have specified a list of domains for HTTPS requests.  Do
>> WP robots send a few requests against the same host before moving on to
>> the next one?
>>
>>  AddrMap M2 = {
>>        names = ['google.com:9191','facebook.com:9191','youtube.com:9191'...
>>
>> Request #1: https://google.com:9191/.../t05/_00000002.html
>> Request #2: https://facebook.com:9191/.../t05/_00000003.html
>> Request #3: https://youtube.com:9191/.../t05/_00000004.html
>>
>> If the robots send HTTPS requests according to the sequence specified
>> in address-map (like the example shown above),
>
> Robots select a random server from the Robot.origins array. If you
> prefer to control the order of requests, you can tell a robot to replay
> a trace file.
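>
> For example, a minimal sketch (the trace file path is a placeholder, and
> the exact replay option name may differ across Polygraph versions; see
> the PGL Robot reference):
>
>   Robot R = {
>       kind = "R101";
>       foreign_trace = "/tmp/urls.trace"; // replay requests in file order
>       ...
>   };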
>
>
>> then the SSL connection
>> would be terminated before sending a new request to the next host, is
>> that correct?
>
> Robots close connections based on configured timeouts, robot connection
> limits, HTTP message properties, next HTTP hop decisions, and various
> errors. I believe that is true for both plain and encrypted connections.
>
> If a robot has to switch from server S1 to server S2 then the persistent
> connection to S1 (if any) may be placed in the pool of idle persistent
> connections, to be reused when the same robot decides to revisit S1
> again (unless it has been purged from the pool due to timeout,
> connection limit, or disconnect).
>
>
>> If I really want to send multiple HTTPS requests via the
>> same SSL connection, do I need to modify the address-map like below?
>>
>>  AddrMap M2 = {
>>        names = ['google.com:9191','google.com:9191','google.com:9191','google.com:9191','facebook.com:9191','facebook.com:9191','facebook.com:9191','youtube.com:9191'...
>
>
> No. Address maps and SSL encryption are not directly related to HTTP
> persistent connection reuse. If your focus is on getting persistent
> connections to work, you need to set pconn_use_lmt and idle_pconn_tout
> options on both robot and server side of the test. If possible, I
> recommend getting that to work without SSL first (just to keep things
> simpler) and then enabling SSL.
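>
> For reference, a minimal server-side sketch (the limit and timeout values
> below are placeholders; match them to your robot-side settings):
>
>   Server S = {
>       kind = "S101";
>       pconn_use_lmt = const(4);   // serve up to 4 requests per connection
>       idle_pconn_tout = 15sec;    // then close idle persistent connections
>   };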
>
> Also, I would disable open_conn_lmt to start with and then enable it
> when everything is working.
>
> Finally, I would start with a single robot to make triage easier.
>
>
> HTH,
>
> Alex.
>
>
>
>> On Tue, Dec 18, 2012 at 6:20 PM, Dmitry Kurochkin wrote:
>>> Hi Jacky.
>>>
>>> unjc email <unjc.email at gmail.com> writes:
>>>
>>>> Hello there,
>>>>
>>>> I need some help configuring SSL sessions.  The following is what I
>>>> have configured for the robot.  I want to configure the client
>>>> workload to send three or four requests per SSL connection.  With the
>>>> current setting, I found each HTTPS request has its own SSL connection
>>>> and it is closed upon receiving the requested object.  Please advise
>>>> the correct setting to configure robots to make multiple requests in a
>>>> single SSL connection.
>>>>
>>>
>>> Robot config looks good.  Did you set pconn_use_lmt for Server?
>>>
>>>> As you see, I have set two domain lists for the clients: one set is for
>>>> HTTP requests and the other set for HTTPS requests.  They are all
>>>> unique domains.  Would there be a problem for robots to reuse SSL
>>>> connections for requesting different objects from the same site/domain?
>>>>
>>>
>>> No.
>>>
>>> Regards,
>>>   Dmitry
>>>
>>>> Robot R = {
>>>>       kind = "R101";
>>>>       pop_model = {
>>>>               pop_distr = popUnif();
>>>>       };
>>>>       recurrence = 50%;
>>>>       req_rate = undef();
>>>>       origins = [M1.names, M2.names: 10%];
>>>>       credentials = select(totalMemberSpace, totalRobots);
>>>>       SslWrap wrap1 = {
>>>>               ssl_config_file = "/tmp/ssl.conf";
>>>>               protocols = ["any"];
>>>>               ciphers = ["ALL:HIGH": 100%];
>>>>               rsa_key_sizes = [1024bit];
>>>>               session_resumption = 40%;
>>>>               session_cache = 100;
>>>>       };
>>>>       ssl_wraps = [wrap1];
>>>>       addresses = robotAddrs(authAddrScheme, theBench);
>>>>       pconn_use_lmt = const(2147483647);
>>>>       idle_pconn_tout = idleConnectionTimeout;
>>>>       open_conn_lmt = maxConnPerRobot;
>>>>       http_versions = ["1.0"];
>>>> };
>>>>
>>>> AddrMap M1 = {
>>>>       names = ['affiliate.de:9090','buzzfeed.com:9090','usbank.com:9090'...
>>>>
>>>> AddrMap M2 = {
>>>>       names = ['google.com:9191','facebook.com:9191','youtube.com:9191'...
>>>>
>>>>
>>>>
>>>>
>>>>
>>>> Thank you very much,
>>>> Jacky
>>>> _______________________________________________
>>>> Users mailing list
>>>> Users at web-polygraph.org
>>>> http://www.web-polygraph.org/mailman/listinfo/users
>
