[jetty-users] Limiting number of parallel requests each client can make

Hi,
we're using an embedded Jetty HTTP server.
Some requests may take a long time to complete. To simplify, let's say
each request needs a resource that is only available to one thread at a
time; the other threads block on `synchronized` before the server sends
the response back.
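A minimal sketch of the situation described above (the class, the lock, and the 100 ms delay are illustrative stand-ins, not our actual code): each request gets a pool thread, but the shared lock serializes the real work, so the threads are held while they wait.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class SerializedResource {
    // Hypothetical stand-in for the resource only one thread may use at a time.
    private static final Object RESOURCE_LOCK = new Object();

    static String handleRequest(String client) throws InterruptedException {
        synchronized (RESOURCE_LOCK) {   // only one request at a time gets past here
            Thread.sleep(100);           // simulate the slow work on the resource
            return "done for " + client;
        }
    }

    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(4);
        long start = System.nanoTime();
        Future<String> a = pool.submit(() -> handleRequest("A"));
        Future<String> b = pool.submit(() -> handleRequest("B"));
        a.get();
        b.get();
        long elapsedMs = (System.nanoTime() - start) / 1_000_000;
        // Both requests held pool threads the whole time, but the lock
        // serialized them: total time is at least 2 x 100 ms.
        System.out.println("serialized: " + (elapsedMs >= 200));
        pool.shutdown();
    }
}
```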

Currently a single client can call this server many times in parallel,
and we end up in a state where all the thread-pool threads are in use,
so any other client waits in a queue for a long time before being
served.

Do I understand correctly that the queue where requests wait before
being assigned a thread is the one inside `ManagedSelector`?
Can, or should, I try to change how it works?


Now we would like to limit the number of parallel requests each client
can make, or limit the number of requests per client per second. The
goal is that if one client makes many parallel long-running requests,
the other clients are still served normally.

We would like to separate clients from each other based on TLS certificates.
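One way to sketch this separation: in a servlet, the client's certificate chain is exposed via the standard `javax.servlet.request.X509Certificate` request attribute, and the subject DN of the first certificate can serve as the per-client key. The helper below is an assumption about how we might derive that key, not code from our server; the fallback to the remote address is also our own choice.

```java
import java.security.cert.X509Certificate;

public class ClientKey {
    /**
     * Derive a per-client identifier from the TLS client certificate chain.
     * In a servlet, the chain would be obtained via
     * request.getAttribute("javax.servlet.request.X509Certificate").
     */
    static String clientKey(X509Certificate[] chain, String remoteAddr) {
        if (chain != null && chain.length > 0) {
            // The first certificate in the chain is the client's own cert.
            return chain[0].getSubjectX500Principal().getName();
        }
        return remoteAddr; // no client cert presented: fall back to the peer address
    }

    public static void main(String[] args) {
        // With no certificate chain, the helper falls back to the address.
        System.out.println(clientKey(null, "203.0.113.7"));
    }
}
```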


What I found so far:

- LowResourceMonitor -- this can detect when all the threads are in
use, but I don't think it can be used to limit assigning new threads to
the requests waiting in the queue

- org.eclipse.jetty.servlets.DoSFilter -- I can use this to:
-- reject requests (I would prefer not to)
-- throttle them
-- delay and then throttle them
Does the throttling or delaying in the DoSFilter happen after the
request has already been assigned its thread, thereby blocking that
thread from being used by other clients? Is it only meant to slow down
the client's future requests, without stopping the client from sending
requests in parallel?
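For reference, this is roughly how I would wire the DoSFilter into our embedded setup. It is a hedged sketch, assuming an existing `ServletContextHandler` named `context`; the parameter values are placeholders for illustration, not tuned numbers.

```java
import java.util.EnumSet;
import javax.servlet.DispatcherType;
import org.eclipse.jetty.servlet.FilterHolder;
import org.eclipse.jetty.servlet.ServletContextHandler;
import org.eclipse.jetty.servlets.DoSFilter;

// Sketch only: `context` is an already-configured ServletContextHandler.
FilterHolder dos = new FilterHolder(DoSFilter.class);
dos.setInitParameter("maxRequestsPerSec", "5");  // per-client request rate limit
dos.setInitParameter("delayMs", "500");          // delay over-limit requests instead of rejecting
dos.setInitParameter("throttledRequests", "2");  // delayed requests allowed through concurrently
context.addFilter(dos, "/*", EnumSet.of(DispatcherType.REQUEST));
```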

Would you recommend anything else for my use case?

Thanks a lot.

Regards
Premek

