Re: [jetty-dev] Trying to make sense of slow server response times

Thanks a lot for your explanations. This indeed sounds like a sophisticated and useful way to test the server. Is the testing setup publicly available somewhere (I'm just curious)?


On 09.05.22 13:30, Joakim Erdfelt wrote:

  > So firstly, this is a really bad way to benchmark a server:



Ok, but how can I benchmark a server without using (some) client code?
And assuming I can't, what kind of client code should I use instead? All
I need to do is send a GET request and receive the response (containing
a single number).
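
For concreteness, the client side amounts to little more than the following
(a minimal sketch using the plain JDK 11+ HttpClient; the URL and the parsing
of the single number are placeholders, not my actual code):

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class SingleGet {
    public static void main(String[] args) throws Exception {
        // Plain JDK 11+ HttpClient; URL and path are placeholders.
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8080/value"))
                .GET()
                .build();

        // Send the GET and read the body as a string.
        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());

        // The body is expected to contain a single number.
        long value = Long.parseLong(response.body().trim());
        System.out.println("status=" + response.statusCode() + " value=" + value);
    }
}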


In the case of 1 server and 1 client, you are not testing the server at all.
You are testing how the OS handles threads, the effectiveness of the OS
loopback networking stack, and the efficiency of the CPU cache.
If you benchmark the server in this kind of setup, you will find that it is
barely doing anything and not breaking a sweat.

When we benchmark, we have to set up at least 5 *machines* generating load
(as clients) against 1 server. (It does not matter what technology generates
the load; we still need this.)
Having the clients on different hardware from the server is important for
getting honest benchmark numbers (for a whole host of reasons).
While performing the benchmark, we gather metrics on the server side and on
each individual client side, and produce a single report for the whole
benchmark.
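
To make the "technology does not matter" point concrete, each client machine
could run something as trivial as the loop below (a rough sketch with the JDK
HttpClient, not our actual harness; the host name, thread count, and request
count are arbitrary placeholders, and the latency it prints is the kind of
per-client metric that gets merged into the combined report):

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.concurrent.*;
import java.util.concurrent.atomic.LongAdder;

public class LoadClient {
    public static void main(String[] args) throws Exception {
        // Placeholders: point this at the real server, not at loopback.
        URI uri = URI.create("http://server-under-test:8080/value");
        int threads = 50;
        int requestsPerThread = 10_000;

        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(uri).GET().build();

        LongAdder totalNanos = new LongAdder();
        LongAdder errors = new LongAdder();

        ExecutorService pool = Executors.newFixedThreadPool(threads);
        for (int t = 0; t < threads; t++) {
            pool.submit(() -> {
                for (int i = 0; i < requestsPerThread; i++) {
                    long start = System.nanoTime();
                    try {
                        client.send(request, HttpResponse.BodyHandlers.ofString());
                    } catch (Exception e) {
                        errors.increment();
                    }
                    totalNanos.add(System.nanoTime() - start);
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.HOURS);

        // Per-client summary; each client machine reports its own numbers.
        long requests = (long) threads * requestsPerThread;
        System.out.printf("requests=%d errors=%d mean latency=%.2f ms%n",
                requests, errors.sum(), totalNanos.sum() / 1e6 / requests);
    }
}

The same idea works with wrk, JMeter, or Jetty's own HttpClient; what matters
is that the load originates from hardware separate from the server.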

It's not uncommon to have to significantly scale up the number of client
machines in order to get real values out of the server configuration; how
many client machines are needed depends on what the server webapps are doing.

Joakim Erdfelt / joakim@xxxxxxxxxxx



