Is there a reasonable way to deal with antivirus software scanning the working directory?

My Win32 application performs numerous disk operations in a designated temporary folder while it runs, and seriously redesigning it is out of the question.

Some clients have antivirus software that scans the same temporary directory (it simply scans everything). We tried to talk them into disabling it; that didn't work, so it's out of the question as well.

Every once in a while (something like once for every one thousand file operations) my application tries to perform an operation on a file which is at that very time opened by the antivirus and is therefore locked by the operating system. A sharing violation occurs and causes an error in my application. This happens about once in three minutes on average.

The temporary folder can contain up to 100k files in most typical scenarios, so I don't like the idea of keeping them all open at all times, since that could exhaust resources under some edge conditions.

Is there some reasonable strategy for my application to react to situations when a needed file is locked? Maybe something like this?

for( int i = 0; i < ReasonableNumber; i++ ) {
    try {
        performOperation(); // do useful stuff here
        break;              // success - stop retrying
    } catch( ... ) {
        if( i == ReasonableNumber - 1 ) {
            throw; // don't hide errors if the unlock never happens
        }
    }
    Sleep( ReasonableInterval ); // reached only after a failed attempt
}

Is this a viable strategy? If so, how many times and how often should my application retry? What are better ideas if any?

2024-07-30 16:22:01

A virus scanner that locks files while it's scanning them is quite bad. Clients who have virus scanners this bad need to have their brains replaced... ;-)

Okay, enough ranting. If a file is locked by some other process, then you can use a "try again" strategy like you suggest. OTOH, do you really need to close and then re-open those files? Can't you keep them open until your process is done?
One tip: add a delay (sleep) before you try to re-open the file. About 100 ms should be enough. If the virus scanner keeps the file open that long, then it's a really bad scanner, and clients with scanners that bad deserve the exception message they'll see.
Typically, try up to three times: open, on failure try again, on second failure try again, on third failure just crash.

Remember to crash in a user-friendly way.
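
A minimal sketch of that policy in Win32 C++ (my construction, not the answerer's code; performOperation and the message wording are placeholders):

#include <windows.h>
#include <stdexcept>
#include <string>

void performOperation(); // the file operation in question, defined elsewhere

// Try the operation up to three times, pausing 100 ms between attempts;
// on the final failure, surface an error the user can actually act on.
void performWithRetry() {
    const int kMaxAttempts = 3;
    for (int attempt = 1; ; ++attempt) {
        try {
            performOperation();
            return; // success
        } catch (const std::exception& e) {
            if (attempt == kMaxAttempts) {
                // "Crash" in a user-friendly way: name the likely culprit.
                throw std::runtime_error(
                    std::string("A file stayed locked by another program "
                                "(antivirus?) after 3 attempts: ") + e.what());
            }
        }
        Sleep(100); // give the scanner a chance to release the file
    }
}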

老街孤人 2024-07-30 16:22:01

I've had experience with antivirus software made by both Symantec and AVG that resulted in files being unavailable for opening.

A common problem we experienced back in the 2002 time frame with Symantec was with MSDev6 when a file was updated in this sequence:

  1. a file is opened
  2. contents are modified in memory
  3. application needs to commit changes
  4. application creates new tmp file with new copy of file + changes
  5. application deletes old file
  6. application copies tmp file to old file name
  7. application deletes the tmp file

The problem would occur between step 5 and step 6. Symantec would do something to slow down the delete, preventing the creation of a file with the same name (CreateFile returned ERROR_DELETE_PENDING). MSDev6 would fail to notice that, meaning step 6 failed. Step 7 still happened, though, and the delete of the original would eventually finish. So the file no longer existed on disk!
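
A hedged sketch of a workaround for that race (my reconstruction, not code from the answer): when re-creating a file right after deleting its predecessor, retry while CreateFile reports ERROR_DELETE_PENDING, since the pending delete completes shortly.

#include <windows.h>

// Re-create a file whose previous version was just deleted. If the delete
// is still pending (or a scanner holds the file), wait briefly and try
// again instead of failing on the first attempt.
HANDLE CreateFileWhenDeleteSettles(const wchar_t* path, int maxAttempts) {
    for (int attempt = 0; attempt < maxAttempts; ++attempt) {
        HANDLE h = CreateFileW(path, GENERIC_WRITE, 0, nullptr,
                               CREATE_NEW, FILE_ATTRIBUTE_NORMAL, nullptr);
        if (h != INVALID_HANDLE_VALUE)
            return h; // created successfully
        DWORD err = GetLastError();
        if (err != ERROR_DELETE_PENDING && err != ERROR_SHARING_VIOLATION)
            break;      // unrelated failure - don't mask it by retrying
        Sleep(100);     // let the pending delete / scan finish
    }
    return INVALID_HANDLE_VALUE; // caller inspects GetLastError()
}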

With AVG, we've been experiencing intermittent problems being able to open files that have just been modified.

Our resolution was a try/catch in a reasonable loop as in the question. Our loop count is 5.

笨笨の傻瓜 2024-07-30 16:22:01

If there is the possibility that some other process - be it the antivirus software, a backup utility or even the user themselves - can open the file, then you must code for that possibility.

Your solution, while perhaps not the most elegant, will certainly work as long as ReasonableNumber is sufficiently large - in the past I've used 10 as the reasonable number. I certainly wouldn't go any higher and you could get away with a lower value such as 5.

The value of the sleep? 100 ms, or 200 ms at most.

Bear in mind that most of the time your application will get the file first time anyway.

行雁书 2024-07-30 16:22:01

Depends on how big your files are, but for 10s to 100s of KB I find that 5 tries with a 100 ms (0.1 second) wait is sufficient. If you still hit the error once in a while, double the wait, but YMMV.

If you have a few places in the code which need to do this, may I suggest taking a functional approach:

using System;

namespace Retry
{
    class Program
    {
        static void Main(string[] args)
        {
            int i = 0;
            Utils.Retry(() =>
            {
                i = i + 1;
                if (i < 3)
                    throw new ArgumentOutOfRangeException();
            });
            Console.WriteLine(i);
            Console.Write("Press any key...");
            Console.ReadKey();
        }
    }

    class Utils
    {
        public delegate void Retryable();
        static int RETRIES = 5;
        static int WAIT = 100; /*ms*/

        // Run the delegate, retrying up to RETRIES times with an
        // exponentially growing wait between attempts; rethrow the last
        // exception if every attempt fails.
        public static void Retry( Retryable retryable )
        {
            int retries = RETRIES;
            int wait = WAIT;
            Exception err;
            do
            {
                try
                {
                    err = null;
                    retryable();
                }
                catch (Exception e)
                {
                    err = e;
                    if (retries != 1) // no point sleeping after the last try
                    {
                        System.Threading.Thread.Sleep(wait);
                        wait *= 2; // exponential back-off
                    }
                }
            } while( --retries > 0 && err != null );
            if (err != null)
                throw err; // note: rethrowing like this resets the stack trace
        }
    }
}
骄兵必败 2024-07-30 16:22:01

Could you change your application so that you don't release the file handle? If you hold a lock on the file yourself, the antivirus application will not be able to scan it.

Otherwise a strategy such as yours will help a bit, because it only reduces the probability; it doesn't solve the problem.
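
A minimal sketch of that idea (my construction, assuming plain Win32; the function name is a placeholder): open the file with a share mode of zero, so no other user-mode process can open it while the handle is held. A scanner running as a kernel filter driver may still inspect the file regardless.

#include <windows.h>

// Open (or create) a file with dwShareMode = 0: while this handle stays
// open, any other process's CreateFile on the same path fails with
// ERROR_SHARING_VIOLATION - including a user-mode virus scanner's.
HANDLE OpenExclusive(const wchar_t* path) {
    return CreateFileW(path,
                       GENERIC_READ | GENERIC_WRITE,
                       0,                      // no sharing: exclusive access
                       nullptr,                // default security
                       OPEN_ALWAYS,            // create if missing
                       FILE_ATTRIBUTE_NORMAL,
                       nullptr);               // INVALID_HANDLE_VALUE on failure
}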

柳絮泡泡 2024-07-30 16:22:01

Tough problem. Most ideas I have go in a direction you don't want (e.g. a redesign).

I don't know how many files you have in your directory, but if there are not that many, you may be able to work around your problem by keeping all of them open and locked while your program runs.

That way the virus scanner will have no chance to interrupt your file accesses anymore.
