The simplest way to perform an HTTP request is the following:
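A sketch of what this looks like, assuming a started HttpClient instance and a placeholder URI:

```java
// Create and start the client (done once per application)
HttpClient httpClient = new HttpClient();
httpClient.start();

// Blocks until the request/response conversation completes
ContentResponse response = httpClient.GET("http://domain.com/path?query");
```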
HttpClient.GET(...) performs an HTTP GET request to the given URI and returns a
ContentResponse when the request/response conversation completes successfully.
The ContentResponse object contains the HTTP response information: status code, headers
and possibly content. The content length is limited by default to 2 MiB; for larger content see
Response Content Handling.
If you want to customize the request, for example by issuing a HEAD request instead of a GET, or by simulating a browser user agent, you can do it in this way:
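A sketch of the fluent form (the URI and the user-agent string are placeholders):

```java
ContentResponse response = httpClient.newRequest("http://domain.com/path")
        .method(HttpMethod.HEAD)
        .agent("Mozilla/5.0 (compatible)")
        .send();
```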
This is a shorthand for:
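That is, the same request built step by step on an explicit Request object (same placeholder URI and user agent):

```java
// Create the request object, then customize it before sending
Request request = httpClient.newRequest("http://domain.com/path");
request.method(HttpMethod.HEAD);
request.agent("Mozilla/5.0 (compatible)");
ContentResponse response = request.send();
```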
You first create a request object using
httpClient.newRequest(...), and then you customize it
using the fluent API style (that is, chained invocation of methods on the request object). When the request object
is customized, you call
Request.send(), which produces the ContentResponse when
the request/response conversation is complete.
Simple POST requests also have a shortcut method:
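A sketch of the POST shortcut, with a placeholder URI and parameter:

```java
ContentResponse response = httpClient.POST("http://domain.com/entity/1")
        .param("p", "value")
        .send();
```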
The POST parameter values are automatically URL-encoded.
Jetty HTTP client automatically follows redirects, so it transparently handles the typical web pattern POST/Redirect/GET, and the response object contains the content of the response to the GET request. Following redirects is a feature that you can enable/disable on a per-request basis or globally.
File uploads also require just one line, and make use of JDK 7's java.nio.file.Paths API:
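A sketch of a one-line upload (the URI and file name are placeholders):

```java
ContentResponse response = httpClient.POST("http://domain.com/upload")
        .file(Paths.get("file_to_upload.txt"), "text/plain")
        .send();
```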
It is possible to impose a total timeout for the request/response conversation using the
Request.timeout(...) method, in this way:
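A sketch with a 5-second total timeout (placeholder URI):

```java
ContentResponse response = httpClient.newRequest("http://domain.com/path")
        .timeout(5, TimeUnit.SECONDS)
        .send();
```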
In the example above, when the 5 seconds expire, the request is aborted and a
java.util.concurrent.TimeoutException is thrown.
So far we have shown how to use Jetty HTTP client in a blocking style, that is, the thread that issues the request blocks until the request/response conversation is complete. In this section we will look at the Jetty HTTP client asynchronous, non-blocking APIs, which are well suited for large content downloads, for parallel processing of requests/responses, and for all those cases where performance and efficient thread and resource utilization are key factors.
The asynchronous APIs rely heavily on listeners that are invoked at various stages of request and response processing. These listeners are implemented by applications and may perform any kind of logic. The implementation invokes these listeners in the same thread that is used to process the request or response. Therefore, if the application code in these listeners takes a long time to execute, the request or response processing is delayed until the listener returns.
If you need to execute application code that takes a long time inside a listener, you must spawn your own thread and remember to deep-copy any data provided by the listener that you will need in your code, because when the listener returns, the data it provides may be recycled/cleared/destroyed.
Request and response processing are executed by two different threads and therefore may happen concurrently. A typical example of this concurrent processing is an echo server, where a large upload may be concurrent with the large download echoed back. As a side note, remember that responses may be processed and completed before requests; a typical example is a large upload that triggers a quick response - for example an error - by the server: the response may arrive and be completed while the request content is still being uploaded.
The application thread that calls
Request.send(CompleteListener) performs the processing
of the request until either the request is fully processed or until it would block on I/O, then it returns (and
therefore never blocks). If it would block on I/O, the thread asks the I/O system to emit an event when the
I/O will be ready to continue, then returns. When such an event is fired, a thread taken from the
HttpClient thread pool will resume the processing of the request.
Responses are processed either by the I/O system thread that fires the event that bytes are ready to
be read, or by a thread taken from the
HttpClient thread pool (this is controlled by the
HttpClient.isDispatchIO() property). Response processing continues until either the response is
fully processed or it would block for I/O. If it would block for I/O, the thread asks the I/O system to
emit an event when the I/O is ready to continue, then returns. When such an event is fired, a thread taken from the
HttpClient thread pool will resume the processing of the response.
When the request and the response are both fully processed, the thread that finished the last processing (usually the thread that processes the response, but it may also be the thread that processes the request, if the request takes longer than the response to be processed) is used to dequeue the next request for the same destination and to process it.
A simple asynchronous GET request that discards the response content can be written in this way:
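A sketch of the non-blocking form (placeholder URI):

```java
httpClient.newRequest("http://domain.com/path")
        .send(new Response.CompleteListener()
        {
            @Override
            public void onComplete(Result result)
            {
                // The response content has been discarded;
                // result gives access to the response object
            }
        });
```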
Request.send(Response.CompleteListener) returns void and does not block; the
Response.CompleteListener provided as a parameter is notified when the request/response conversation is
complete, and the
Result parameter allows you to access the response object.
You can write the same code using JDK 8's lambda expressions:
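The lambda form of the same sketch:

```java
httpClient.newRequest("http://domain.com/path")
        .send(result -> { /* your logic here */ });
```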
You can impose a total timeout for the request/response conversation in the same way used by the synchronous API:
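A sketch with a 3-second total timeout (placeholder URI):

```java
httpClient.newRequest("http://domain.com/path")
        .timeout(3, TimeUnit.SECONDS)
        .send(result -> { /* your logic here */ });
```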
The example above will impose a total timeout of 3 seconds on the request/response conversation.
The HTTP client APIs use listeners extensively to provide hooks for all possible request and response events, and with JDK 8's lambda expressions they're even more fun to use:
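A sketch that hooks several of the per-event listeners on a single request (placeholder URI; the comments indicate when each hook fires):

```java
httpClient.newRequest("http://domain.com/path")
        .onRequestQueued(request -> { /* request queued */ })
        .onRequestBegin(request -> { /* request begins */ })
        .onResponseBegin(response -> { /* response status line received */ })
        .onResponseHeaders(response -> { /* response headers available */ })
        .onResponseContent((response, buffer) -> { /* content chunk received */ })
        .send(result -> { /* conversation complete */ });
```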
This makes Jetty HTTP client suitable for HTTP load testing because, for example, you can accurately time every step of the request/response conversation (thus knowing where the request/response time is really spent).
Jetty HTTP client provides a number of utility classes off the shelf to handle request content.
You can provide request content as an
InputStream, or provide your own implementation of
org.eclipse.jetty.client.api.ContentProvider. Here's an example that provides the
request content using a file, via the java.nio.file.Paths API:
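A sketch of the file-based shortcut (the URI and file name are placeholders):

```java
ContentResponse response = httpClient.newRequest("http://domain.com/path")
        .file(Paths.get("file_to_upload.txt"))
        .send();
```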
This is equivalent to using the
PathContentProvider utility class:
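The same upload written explicitly with PathContentProvider (from org.eclipse.jetty.client.util), with a placeholder content type:

```java
ContentResponse response = httpClient.newRequest("http://domain.com/path")
        .content(new PathContentProvider(Paths.get("file_to_upload.txt")), "text/plain")
        .send();
```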
Alternatively, you can use
FileInputStream via the
InputStreamContentProvider utility class:
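A sketch wrapping a FileInputStream in an InputStreamContentProvider (file name is a placeholder):

```java
ContentResponse response = httpClient.newRequest("http://domain.com/path")
        .content(new InputStreamContentProvider(new FileInputStream("file_to_upload.txt")), "text/plain")
        .send();
```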
Since InputStream is blocking, the send of the request will also block if the
input stream blocks, even when using the asynchronous HttpClient APIs.
If you have already read the content in memory, you can pass it as a
byte[] using the
BytesContentProvider utility class:
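A sketch with in-memory content (the bytes and content type are placeholders):

```java
byte[] bytes = new byte[]{0, 1, 2, 3};
ContentResponse response = httpClient.newRequest("http://domain.com/path")
        .content(new BytesContentProvider(bytes), "application/octet-stream")
        .send();
```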
If the request content is not immediately available, but your application will be notified of the
content to send, you can use
DeferredContentProvider in this way:
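A sketch of deferred content: the request is sent immediately, and content is offered to the provider as it becomes available (URI and bytes are placeholders):

```java
DeferredContentProvider content = new DeferredContentProvider();
httpClient.newRequest("http://domain.com/path")
        .content(content)
        .send(result -> { /* your logic here */ });

// Later, when a chunk of content becomes available:
byte[] bytes = new byte[]{0, 1, 2, 3};
content.offer(ByteBuffer.wrap(bytes));

// When no more content is available:
content.close();
```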
While the request content is awaited and consequently uploaded by the client application, the server
may be able to respond (at least with the response headers) completely asynchronously.
In this case,
Response.Listener callbacks will be invoked before the request is fully sent.
This allows fine-grained control of the request/response conversation: for example, the server may reject
content that is too large and send a response to the client, which in turn may stop the content upload.
Another way to provide request content is by using an
OutputStreamContentProvider, which allows applications to write request content, when it is available, to the
OutputStream provided by the OutputStreamContentProvider:
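A sketch of the OutputStream-based approach (URI and bytes are placeholders); closing the stream signals that the content is complete:

```java
OutputStreamContentProvider content = new OutputStreamContentProvider();
try (OutputStream output = content.getOutputStream())
{
    httpClient.newRequest("http://domain.com/path")
            .content(content)
            .send(result -> { /* your logic here */ });

    // Write the request content as it becomes available
    byte[] bytes = new byte[]{0, 1, 2, 3};
    output.write(bytes);
}
```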
Jetty HTTP client allows applications to handle response content in different ways.
The first way is to buffer the response content in memory; this is done when using the blocking APIs (see
Blocking APIs), and the content is buffered within a
ContentResponse up to the default limit of 2 MiB.
If you want to control the length of the response content (for example, limiting to values smaller than the
default of 2 MiB), then you can use an
org.eclipse.jetty.client.util.FutureResponseListener in this way:
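A sketch that raises the buffering limit to 8 MiB (the URI and the 5-second wait are placeholders):

```java
Request request = httpClient.newRequest("http://domain.com/path");

// Buffer response content up to 8 MiB instead of the default 2 MiB
FutureResponseListener listener = new FutureResponseListener(request, 8 * 1024 * 1024);
request.send(listener);

// Wait at most 5 seconds for the response to arrive
ContentResponse response = listener.get(5, TimeUnit.SECONDS);
```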
If the response content length is exceeded, the response will be aborted and an exception will be thrown.
If you are using the asynchronous APIs (see Asynchronous APIs), you can use the
BufferingResponseListener utility class:
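A sketch of the asynchronous buffering variant, again with an 8 MiB limit as a placeholder:

```java
httpClient.newRequest("http://domain.com/path")
        .send(new BufferingResponseListener(8 * 1024 * 1024)
        {
            @Override
            public void onComplete(Result result)
            {
                if (!result.isFailed())
                {
                    // The whole response content, buffered in memory
                    byte[] responseContent = getContent();
                }
            }
        });
```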
The second way is the most efficient (because it avoids content copies) and allows you to specify a
Response.ContentListener, or a subclass, to handle the content as soon as it arrives:
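A sketch overriding onContent via the Response.Listener.Adapter subclass (placeholder URI); each buffer is valid only for the duration of the callback:

```java
httpClient.newRequest("http://domain.com/path")
        .send(new Response.Listener.Adapter()
        {
            @Override
            public void onContent(Response response, ByteBuffer buffer)
            {
                // Consume the chunk as soon as it arrives;
                // the buffer may be recycled after this method returns
            }
        });
```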
The third way allows you to wait for the response and then stream the content using the
InputStreamResponseListener utility class:
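A sketch of the streaming variant (the URI and the 5-second wait are placeholders):

```java
InputStreamResponseListener listener = new InputStreamResponseListener();
httpClient.newRequest("http://domain.com/path")
        .send(listener);

// Wait at most 5 seconds for the response headers to arrive
Response response = listener.get(5, TimeUnit.SECONDS);

if (response.getStatus() == 200)
{
    // Read the response content as it arrives
    try (InputStream responseContent = listener.getInputStream())
    {
        // your logic here
    }
}
```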