Memory usage issue when using HTTPS for web services or web requests

Posted 2024-12-25 23:36:56

EDIT: After a long discussion in the comments, it seems that my original question didn't really capture what was going on. Here's a summary of where I am now:

  • When using HTTPS, there's what I would consider a dramatic jump in the virtual memory space of my application (typically from under 50 MB to over 200 MB) when the first HTTPS request (web service call, WebClient.DownloadFile(), etc.) is made; a minimal measurement sketch follows this list
  • At the same time, a CPU core also moves to nearly 100% usage. This typically only lasts a few seconds, but I have seen it last longer
  • It may very well be the case that this is just the cost of using HTTPS, but I was surprised by it since I had never noticed it before in other apps (and other developers on my team never noticed it in this app, which has been using HTTPS since long before I came on board).
  • The kicker: this doesn't appear to happen on all machines, but does on most. If it happened consistently on every machine, I'd be more willing to accept it as a "cost of doing business." But since there does seem to be a difference between some machines running the same code and OS, I would like to understand why that is since it will either a) allow us to mitigate the behavior, or b) explain it in a way that satisfies non-technical higher-ups that it's not actually a "problem", as explaining that Windows Task Manager shows virtual memory and not necessarily actively-in-use physical memory hasn't been satisfactory so far :/
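
For anyone who wants to reproduce the measurement outside of our application, here is a minimal sketch that samples the process's private bytes and working set around a first HTTPS request. It is not our production code, and https://example.com/ is just a stand-in for any HTTPS endpoint.

using System;
using System.Diagnostics;
using System.Net;

class FirstHttpsRequestMemoryProbe
{
    static void Main()
    {
        var proc = Process.GetCurrentProcess();
        Report("Before first HTTPS request", proc);

        using (var client = new WebClient())
        {
            // Placeholder URL: substitute any HTTPS endpoint.
            client.DownloadString("https://example.com/");
        }

        Report("After first HTTPS request", proc);
    }

    static void Report(string label, Process proc)
    {
        proc.Refresh(); // re-read the memory counters for this process
        Console.WriteLine("{0}: private bytes = {1:N0} KB, working set = {2:N0} KB",
            label, proc.PrivateMemorySize64 / 1024, proc.WorkingSet64 / 1024);
    }
}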

I've left the original post intact below in case anyone is interested, but it focuses more on web services, which aren't really at the root of the problem.

Thanks in advance for any further insight!


We're seeing memory usage increase dramatically whenever our application first makes a call to our web service using https. The specifics vary by machine, but as an example we may see our application jump from ~50mb to over 250mb when the first web service call is made, and the usage never climbs back down. Subsequent calls do not result in another such jump. I can reproduce the behavior with the code below (not specific to our application) and a public web service that we do not own - so it seems to be independent of both our client- and server-side code.

Interestingly, in my test app I don't observe this jump on Windows XP (our application is currently deployed only on Windows 7). We also don't see it on every dev/test machine in the office (but we do on most), and we don't currently have a way to retrieve this info from machines out in the "real world."

I haven't been able to pin down what's being allocated, but several profilers have made it clear that it resides in native (not managed) memory. Analysis of some WinDbg dumps using DebugDiag leads me to believe that there is a lot of memory getting allocated in crypt32.dll that isn't being released. This makes sense to some extent (https implies certificates, security, etc., and it's likely that whatever is being loaded is getting cached, which is why subsequent calls don't result in additional jumps), but I have a hard time believing this is really just the cost of using https for a web service.
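
As a quick cross-check that the growth is native rather than managed (short of a full profiler run), the CLR's own view of the managed heap can be compared with the process-wide private bytes. This is only a rough sketch and the numbers are indicative, not exact:

using System;
using System.Diagnostics;

static class MemoryBreakdown
{
    public static void Print(string label)
    {
        var proc = Process.GetCurrentProcess();
        proc.Refresh();
        long managed = GC.GetTotalMemory(true);       // managed heap only (after a full collect)
        long privateBytes = proc.PrivateMemorySize64; // managed + native

        Console.WriteLine("{0}: managed = {1:N0} KB, private bytes = {2:N0} KB, difference (mostly native) = {3:N0} KB",
            label, managed / 1024, privateBytes / 1024, (privateBytes - managed) / 1024);
    }
}

If the allocation really is in native code such as crypt32.dll, then calling Print before and after the first HTTPS request should show the private-bytes figure jumping while the managed number stays roughly flat.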

I know there will be some responses from the "if higher memory usage isn't causing problems, why worry?" camp. In general I agree - the memory usage numbers in Task Manager often aren't indicative of whether the app is working as intended. If the app were used strictly in-house, I could live with this as long as it wasn't a symptom of other problems. But our app is deployed on consumers' machines, so we have to worry about the perception of a problem just as much as an actual problem. So if there's any way to fix this, I would greatly appreciate it!

Finally, the web service that I use in the test code below is available here: http://ws.cdyne.com/emailverify/Emailvernotestemail.asmx?wsdl. The code for EmailVerNoTestEmail was generated using the wsdl.exe tool, with the slight modification of passing the URL as a parameter to the constructor rather than hard-coding it (so that http/https can be specified on the fly).

public static void Main(string[] args)
{
    const string urlSuffix = "://ws.cdyne.com/emailverify/Emailvernotestemail.asmx";
    string protocol = null;
    while(protocol == null)
    {
        Console.Write("Enter protocol (http, https): ");
        var line = Console.ReadLine();
        // Normalize the input before comparing so stray whitespace or casing doesn't matter.
        if (line != null) line = line.Trim().ToLower();
        if (line == "http" || line == "https")
            protocol = line;
    }
    var url = protocol + urlSuffix;
    Console.WriteLine("Using URL: " + url);
    Console.Out.Flush();

    var service = new EmailVerNoTestEmail(url);

    Console.WriteLine("Press any key to make the web service call...");
    Console.ReadKey(true);

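    // The first call over HTTPS is where the jump in private bytes (and the CPU spike) shows up.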
    Console.WriteLine("Calling web service...");
    var resp = service.VerifyEmail("[email protected]", "test");
    Console.WriteLine("Response: " + resp);

    Console.WriteLine("Press any key to exit.");
    Console.ReadKey(true);
}
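
For reference, the hand edit to the wsdl.exe-generated proxy that the code above relies on might look roughly like this. This is a sketch only; the generated class is a SoapHttpClientProtocol derivative and the exact members and namespace in the real output will differ.

using System.Web.Services.Protocols;

public partial class EmailVerNoTestEmail : SoapHttpClientProtocol
{
    // Added by hand: let the caller pick http or https at runtime
    // instead of the hard-coded Url that wsdl.exe emits in the default constructor.
    public EmailVerNoTestEmail(string url)
    {
        this.Url = url;
    }
}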

Comments (1)

感受沵的脚步 2025-01-01 23:36:56

Unless the system is under memory pressure, it is natural for a process to stay at a certain memory level. The OS has heuristics that determine how much memory is needed and allocates up to a high point greater than what is strictly required. Why? Allocating and deallocating memory is expensive, and if a program can run inside the sandbox it was given, why shrink it when the system is not under pressure?

As you mentioned, the concern is not the amount of memory allocated but whether memory is leaking. I suggest you run perfmon and monitor for leaks; the MSDN CLR article Investigating Memory Issues covers the things to look out for.

EDIT: (This is advice I have given before, but it may be relevant here; sorry for the duplication.)

To see the telltale signs of a memory leak, fire up perfmon and watch the Private Bytes counter for the process. See Identify And Prevent Memory Leaks In Managed Code to begin that process.
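
If watching perfmon interactively is awkward (for example on a tester's machine), the same Private Bytes counter can be sampled from a small console tool instead. This is only a rough sketch; "MyApp" is a placeholder for the process name exactly as perfmon shows it.

using System;
using System.Diagnostics;
using System.Threading;

class PrivateBytesWatcher
{
    static void Main()
    {
        // "MyApp" is a placeholder: use the process name as it appears in perfmon (no .exe).
        using (var counter = new PerformanceCounter("Process", "Private Bytes", "MyApp"))
        {
            while (true)
            {
                // This counter reports bytes; print it once a second.
                Console.WriteLine("Private Bytes: {0:N0} KB", counter.NextValue() / 1024);
                Thread.Sleep(1000);
            }
        }
    }
}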

Another check is on the Processes tab of the Windows Task Manager: under View > Select Columns, enable Handles, GDI Objects and USER Objects, then observe those values for your program. If there is a handle leak, you will see one of them climb steadily; in these scenarios it is most likely GDI.
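
Those same Task Manager columns can also be sampled from code if that is easier to collect from machines in the field; GetGuiResources is the Win32 call behind the GDI Objects and USER Objects columns. A rough sketch:

using System;
using System.Diagnostics;
using System.Runtime.InteropServices;

static class HandleSampler
{
    [DllImport("user32.dll")]
    static extern uint GetGuiResources(IntPtr hProcess, uint uiFlags);

    const uint GR_GDIOBJECTS = 0;  // count of GDI objects
    const uint GR_USEROBJECTS = 1; // count of USER objects

    public static void Sample()
    {
        var proc = Process.GetCurrentProcess();
        proc.Refresh();
        Console.WriteLine("Handles: {0}, GDI objects: {1}, USER objects: {2}",
            proc.HandleCount,
            GetGuiResources(proc.Handle, GR_GDIOBJECTS),
            GetGuiResources(proc.Handle, GR_USEROBJECTS));
    }
}

If one of these climbs steadily across repeated operations, that is the handle leak described above.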

In general, when the OS runs an application it tries to determine how much memory is being used, and it will allocate more memory for the application than the application currently needs. The reason is that allocating/deallocating memory is a CPU-intensive operation. Why dole out only what the app needs when it can sit in a pool of memory that lets it expand and contract without interference from the operating system? It saves cycles.

What the developer sees is a high-water mark for memory. If the system is not under pressure, it will not reclaim any memory and keeps that high-water mark. There are many posts on this site where users say processing is done and memory has been cleaned up, but the OS still shows the app at memory point X when it should be at the lower memory point M. WinForms users report the same thing, but if they minimize the app the reported memory usage suddenly drops to that M level. That is by design: minimizing suggests the app does not need the memory, since it will not be interacting with the user, and the OS reclaims it. If it is not a WinForms app and the OS is not under pressure, the app stays at the high-water mark of X.
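
The "minimize the window and the number drops" behaviour described above can even be triggered explicitly: trimming the working set is what Windows does on minimize, and the same request can be made via SetProcessWorkingSetSize. This sketch is purely to illustrate that the Task Manager figure is a high-water mark, not something to ship in production code.

using System;
using System.Diagnostics;
using System.Runtime.InteropServices;

static class WorkingSetTrimDemo
{
    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool SetProcessWorkingSetSize(IntPtr hProcess, IntPtr dwMinimumWorkingSetSize, IntPtr dwMaximumWorkingSetSize);

    public static void Trim()
    {
        // Passing -1 for both sizes asks Windows to trim the process's working set,
        // much like what happens automatically when a window is minimized.
        SetProcessWorkingSetSize(Process.GetCurrentProcess().Handle, (IntPtr)(-1), (IntPtr)(-1));
    }
}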
