Re: [jetty-users] Limiting number of parallel requests each client can make

Inline ...

On Thu, May 6, 2021 at 9:53 AM Přemysl Vyhnal <premysl@xxxxxxxxxx> wrote:
Thanks for your replies,

> You should take a look at AsyncServlet processing; this allows you to "sleep" requests with low resource needs
Would that mean rewriting the way we sleep or wait for a resource
everywhere in the code called while serving that request? I'd like to
avoid that, as we don't control all of that code and it could be
complicated to change at this point.

You would only need to change the one servlet that handles that resource-constrained behavior.
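For reference, a rough sketch of what that one async servlet could look like. The servlet name, url-pattern, timeout, and single worker thread are illustrative assumptions, not something taken from your setup; it uses the javax.servlet API (Jetty 9/10), which would be jakarta.servlet on Jetty 11+.

import java.io.IOException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

import javax.servlet.AsyncContext;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

@WebServlet(urlPatterns = "/constrained", asyncSupported = true)
public class ConstrainedResourceServlet extends HttpServlet {
    // A single worker thread naturally serializes access to the constrained resource.
    private final ExecutorService worker = Executors.newSingleThreadExecutor();

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) {
        AsyncContext ctx = req.startAsync();   // free the container thread immediately
        ctx.setTimeout(30_000);                // fail requests that have waited too long
        worker.submit(() -> {
            try {
                // ... the resource-constrained work would go here ...
                ctx.getResponse().getWriter().println("done");
            } catch (IOException e) {
                // logging omitted in this sketch
            } finally {
                ctx.complete();                // always end the async cycle
            }
        });
    }

    @Override
    public void destroy() {
        worker.shutdown();
    }
}

The point is that only this servlet changes: the container thread is released immediately, and the constrained work is handed to whatever queue or executor fits your resource.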

 

>Put the QoSFilter in front of the servlet url-pattern that is resource starved.
>And then configure the maxRequests to 1.
>
>All requests that are not actively using that resource will be threadlessly queued with AsyncContext.
>When the active request completes, the queue is used to activate the next request.
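(For reference, a rough sketch of that quoted QoSFilter setup, registered programmatically on an embedded-Jetty ServletContextHandler; the url-pattern and the suspendMs value are assumptions for illustration.)

import java.util.EnumSet;

import javax.servlet.DispatcherType;

import org.eclipse.jetty.servlet.FilterHolder;
import org.eclipse.jetty.servlet.ServletContextHandler;
import org.eclipse.jetty.servlets.QoSFilter;

public class QoSFilterSetup {
    public static void configure(ServletContextHandler context) {
        FilterHolder qos = new FilterHolder(QoSFilter.class);
        qos.setAsyncSupported(true);
        qos.setInitParameter("maxRequests", "1");      // only one active request at a time
        qos.setInitParameter("suspendMs", "30000");    // how long a queued request may wait
        // "/constrained/*" is a placeholder for the url-pattern of the starved resource.
        context.addFilter(qos, "/constrained/*",
                EnumSet.of(DispatcherType.REQUEST, DispatcherType.ASYNC));
    }
}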

In *DoSFilter* I can override the extractUserId method to separate
clients from each other.
Basically I'd like to have separate queues per client (based on
something I extract from the request headers or the SSL cert).

If I use *QoSFilter* with maxRequests=1, wouldn't one client block the other clients?

Yes, the QoSFilter would impact all clients, as designed.

If you have such a requirement, don't use DoSFilter or QoSFilter; use the AsyncContext API directly.
That gives you one queue of requests per userId.
The AsyncContext timeout can also be used to fail a request that has been queued for too long.
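Below is a rough, untested sketch of such a per-client queue built on AsyncContext. The filter name, the X-Client-Id header used to derive the client id, and the 10 second timeout are assumptions for illustration; a real version would derive the id from your headers or the SSL cert, and the filter would need to be mapped for both REQUEST and ASYNC dispatcher types.

import java.io.IOException;
import java.util.ArrayDeque;
import java.util.Queue;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

import javax.servlet.AsyncContext;
import javax.servlet.DispatcherType;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;

public class PerClientQueueFilter implements Filter {

    /** One "active" slot plus a queue of parked requests, per client id. */
    private static final class ClientState {
        boolean active;                                    // guarded by synchronized (this)
        final Queue<AsyncContext> waiting = new ArrayDeque<>();
    }

    // Entries are never evicted in this sketch.
    private final ConcurrentMap<String, ClientState> clients = new ConcurrentHashMap<>();

    @Override
    public void doFilter(ServletRequest req, ServletResponse resp, FilterChain chain)
            throws IOException, ServletException {
        HttpServletRequest request = (HttpServletRequest) req;
        ClientState state = clients.computeIfAbsent(extractUserId(request), k -> new ClientState());

        // Requests re-dispatched from the queue (DispatcherType.ASYNC) already own the slot.
        boolean ownsSlot = request.getDispatcherType() == DispatcherType.ASYNC
                || tryAcquire(state, req);
        if (!ownsSlot)
            return;   // parked threadlessly; it will be re-dispatched later or time out

        try {
            chain.doFilter(req, resp);
        } finally {
            release(state);
        }
    }

    /** Take this client's slot, or park the request with startAsync() and return false. */
    private boolean tryAcquire(ClientState state, ServletRequest req) {
        synchronized (state) {
            if (!state.active) {
                state.active = true;
                return true;
            }
            AsyncContext ctx = req.startAsync();
            ctx.setTimeout(10_000);   // fail a request that has been queued for too long
            state.waiting.add(ctx);
            return false;
        }
    }

    /** Release the slot, or hand it to the next queued request for this client. */
    private void release(ClientState state) {
        while (true) {
            AsyncContext next;
            synchronized (state) {
                next = state.waiting.poll();
                if (next == null) {
                    state.active = false;
                    return;
                }
            }
            try {
                next.dispatch();      // re-enters this filter as an ASYNC dispatch
                return;
            } catch (IllegalStateException alreadyTimedOutOrCompleted) {
                // that queued request already timed out; try the next one
            }
        }
    }

    /** Hypothetical client id extraction; a real one might use headers or the TLS client cert. */
    private String extractUserId(HttpServletRequest request) {
        String id = request.getHeader("X-Client-Id");
        return id != null ? id : "anonymous";
    }

    @Override public void init(FilterConfig filterConfig) { }
    @Override public void destroy() { }
}

Requests beyond the first for a given client are parked with startAsync() (no thread held) and re-dispatched one at a time as that client's active request completes; a parked request that exceeds the timeout is failed by the container.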

Another approach, if you have control over the client, is to reply with status code 429 (Too Many Requests) when there are too many simultaneous requests to that constrained resource.
Just make sure you set the `Retry-After` header to something that makes sense for your scenario.
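A minimal sketch of such a rejection is below; the helper name and the retry interval are illustrative, and the numeric status code is used because HttpServletResponse has no named constant for 429.

import java.io.IOException;

import javax.servlet.http.HttpServletResponse;

final class TooManyRequests {
    /** Reject an excess request with 429 and a Retry-After hint. */
    static void reject(HttpServletResponse resp, int retryAfterSeconds) throws IOException {
        resp.setStatus(429);   // Too Many Requests (RFC 6585)
        resp.setHeader("Retry-After", String.valueOf(retryAfterSeconds));
        resp.getWriter().println("Too many concurrent requests; retry later.");
    }
}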

- Joakim

