How best to clean up outgoing URLConnection objects in a high-usage, multi-threaded environment?

Posted 2024-08-02 12:40:15

I have a use case where a servlet (with reasonably high concurrent use) makes an outgoing URLConnection to retrieve some data, REST-style, as part of its normal server-side processing logic. The connection is created and used on each servlet invocation because the URL is potentially different (though the domain is always the same). I want to ensure that it does this as optimally as possible, so that ports and connections are not held open on the application server longer than they need to be, but are re-used where applicable.

The Javadocs seem a bit vague - on URLConnection:

'Invoking the close() methods on the InputStream or OutputStream of an URLConnection after a request may free network resources associated with this instance, unless particular protocol specifications specify different behaviours for it.'

On HttpURLConnection:

'Each HttpURLConnection instance is used to make a single request but the underlying network connection to the HTTP server may be transparently shared by other instances. Calling the close() methods on the InputStream or OutputStream of an HttpURLConnection after a request may free network resources associated with this instance but has no effect on any shared persistent connection. Calling the disconnect() method may close the underlying socket if a persistent connection is otherwise idle at that time.'

Currently, a URLConnection is being used and only the input stream is closed, as per the code below (error handling and URL reading removed as they are not relevant to the question). My thought is that this will clean up the stream resources but allow re-use of the underlying socket where possible (since the request is always to the same domain, with different URL paths). Any advice on how to optimize further would be appreciated.

URL requestUrl = new URL(location);
URLConnection urlConnection = requestUrl.openConnection();
BufferedReader br = new BufferedReader(new InputStreamReader(urlConnection.getInputStream(), "UTF-8"));
// reading code here
br.close();
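
For reference, a slightly more defensive version of the same call is sketched below. It closes the reader in a try-with-resources block and reads the body to end-of-stream, which is what lets the JDK hand the underlying socket back to its keep-alive cache; how many idle sockets are kept per host is governed by the http.keepAlive and http.maxConnections system properties rather than by anything on URLConnection itself. The RestFetcher class, the fetch method, and the readLine loop are placeholders standing in for the real reading code.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class RestFetcher {

    // Hypothetical helper: same request as above, but the reader is closed
    // even on exceptions and the body is fully drained so the socket can be
    // returned to the JDK's keep-alive cache and re-used for the next request.
    static String fetch(String location) throws IOException {
        URL requestUrl = new URL(location);
        HttpURLConnection urlConnection = (HttpURLConnection) requestUrl.openConnection();
        StringBuilder body = new StringBuilder();
        try (BufferedReader br = new BufferedReader(
                new InputStreamReader(urlConnection.getInputStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = br.readLine()) != null) { // read to end-of-stream
                body.append(line).append('\n');
            }
        }
        // No disconnect() here: disconnecting would close the socket instead
        // of leaving it available for re-use against the same host.
        return body.toString();
    }
}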

Comments (2)

泪是无色的血 2024-08-09 12:40:15

You might consider using Apache's HttpClient. We use it in an app that sends requests, literally, a couple million times a day, load-balanced over a handful of systems. We use a pool of HttpClient objects, which I'm not sure is necessary since we don't keep the connections open between calls, but the code predates my time here and maybe they found a reason for it.
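
To make that concrete, here is a minimal sketch of the pooled approach with Apache HttpClient; the answer does not name a version, so the PoolingHttpClientConnectionManager API from HttpClient 4.3+ is an assumption, and the class name and pool sizes are placeholders.

import java.io.IOException;
import java.nio.charset.StandardCharsets;

import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.impl.conn.PoolingHttpClientConnectionManager;
import org.apache.http.util.EntityUtils;

public class PooledRestClient {

    // One connection manager and one client shared by all servlet threads;
    // both are thread-safe, so they can live for the lifetime of the webapp.
    private static final PoolingHttpClientConnectionManager CONNECTION_MANAGER =
            new PoolingHttpClientConnectionManager();
    private static final CloseableHttpClient CLIENT;

    static {
        CONNECTION_MANAGER.setMaxTotal(50);            // placeholder pool sizes
        CONNECTION_MANAGER.setDefaultMaxPerRoute(20);  // all requests hit one host/route
        CLIENT = HttpClients.custom()
                .setConnectionManager(CONNECTION_MANAGER)
                .build();
    }

    static String fetch(String location) throws IOException {
        HttpGet get = new HttpGet(location);
        try (CloseableHttpResponse response = CLIENT.execute(get)) {
            // Consuming the entity releases the leased connection back to the pool.
            return EntityUtils.toString(response.getEntity(), StandardCharsets.UTF_8);
        }
    }
}

Because the client and the connection manager are thread-safe, sharing a single instance across servlet threads is what makes the pooling pay off; building a new client per request would defeat it.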

吃颗糖壮壮胆 2024-08-09 12:40:15

I think what you have is as good as it's going to get using Java's implementation. If speed is a huge concern and Java's implementation is hogging memory or is too slow, write your own connection class.
