Preventing caching when testing file copy speed in .NET (C#)
I'm trying to run some tests of copy speed over our WAN. As I'd somewhat suspected, the .NET File.Copy(source, dest) method seems to get faster on the second and subsequent runs. I suspect either my corporate network or Windows is doing some crafty caching.
What's the best way to avoid the risk of this happening? Would renaming the source file to a random string each time be sensible, or is there a cleverer way to circumvent it?
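
For reference, a minimal timing harness along these lines (the paths and run count below are placeholders, not from the question) makes the effect easy to observe: if later runs are markedly faster than the first, something along the path is caching the data.

using System;
using System.Diagnostics;
using System.IO;

class CopyBenchmark
{
    // Hypothetical paths -- substitute a local source and your WAN share.
    const string Source = @"C:\temp\payload.bin";
    const string DestDir = @"\\wan-server\share";

    static void Main()
    {
        // Repeat the copy several times; a sharp speed-up after run 1
        // suggests caching rather than genuine link throughput.
        for (int run = 1; run <= 5; run++)
        {
            string dest = Path.Combine(DestDir, Path.GetRandomFileName());
            var sw = Stopwatch.StartNew();
            File.Copy(Source, dest);
            sw.Stop();

            double mb = new FileInfo(Source).Length / (1024.0 * 1024.0);
            Console.WriteLine($"Run {run}: {sw.Elapsed.TotalSeconds:F2} s " +
                              $"({mb / sw.Elapsed.TotalSeconds:F1} MB/s)");
            File.Delete(dest); // remove the test copy before the next run
        }
    }
}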
2 Answers
I think I'll close this. Perhaps the best way is to generate a random file (doing something like Creating a Random File in C#) and transfer that.
I also found that the caching mainly affected local copying, which was less of a concern than the network copies I was trying to measure.
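
A rough sketch of that approach, assuming a hypothetical destination share and a 100 MB payload: each run writes a fresh file of cryptographically random bytes, so no cache on either end can have seen the content before (random data also resists any compression or deduplication a WAN optimizer might apply), then times the copy.

using System;
using System.Diagnostics;
using System.IO;
using System.Security.Cryptography;

class RandomFileCopyTest
{
    static void Main()
    {
        const string destDir = @"\\wan-server\share"; // hypothetical WAN share
        const long sizeBytes = 100L * 1024 * 1024;    // 100 MB test payload

        // Generate a brand-new file of random bytes for this run.
        string source = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName());
        var buffer = new byte[1024 * 1024];
        using (var rng = RandomNumberGenerator.Create())
        using (var fs = File.Create(source))
        {
            for (long written = 0; written < sizeBytes; written += buffer.Length)
            {
                rng.GetBytes(buffer);
                fs.Write(buffer, 0, buffer.Length);
            }
        }

        // Time only the transfer, not the generation.
        string dest = Path.Combine(destDir, Path.GetRandomFileName());
        var sw = Stopwatch.StartNew();
        File.Copy(source, dest);
        sw.Stop();

        double mb = sizeBytes / (1024.0 * 1024.0);
        Console.WriteLine($"{mb / sw.Elapsed.TotalSeconds:F1} MB/s");

        File.Delete(source);
        File.Delete(dest);
    }
}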
I believe it is the file caching working on the remote system. When a file is requested for the first time, the file caching mechanism caches it in RAM in anticipation of future requests and serves subsequent requests from RAM. This only reduces the time taken to read the file from local storage and begin serving it, not the time to transfer it between the two systems.
Corporations normally deploy cache boxes for serving files over web-based resources (for intranet and internet), but I am not aware of any cache-box mechanism that does the same for file shares.
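
If you want to take the Windows file cache on your own side out of the picture when reading, one well-known if unofficial trick is to open the file with the Win32 FILE_FLAG_NO_BUFFERING flag, which has no named FileOptions member but is accepted by FileStream as the raw value 0x20000000. The sketch below (the path is hypothetical) times an uncached sequential read; note that reads must then be made in sector-aligned chunks, and the flag does not necessarily stop a remote server from serving the file out of its own RAM.

using System;
using System.Diagnostics;
using System.IO;

class UncachedReadTest
{
    // FILE_FLAG_NO_BUFFERING (0x20000000) bypasses the local file cache.
    // There is no named FileOptions member for it, but FileStream accepts it.
    const FileOptions NoBuffering = (FileOptions)0x20000000;

    static void Main()
    {
        const string path = @"\\wan-server\share\payload.bin"; // hypothetical

        // 1 MB reads: a multiple of any common sector size, so the
        // alignment requirement of unbuffered I/O is satisfied.
        var buffer = new byte[1 << 20];
        long total = 0;

        var sw = Stopwatch.StartNew();
        using (var fs = new FileStream(path, FileMode.Open, FileAccess.Read,
                                       FileShare.Read, buffer.Length, NoBuffering))
        {
            int read;
            while ((read = fs.Read(buffer, 0, buffer.Length)) > 0)
                total += read;
        }
        sw.Stop();

        double mb = total / (1024.0 * 1024.0);
        Console.WriteLine($"{mb / sw.Elapsed.TotalSeconds:F1} MB/s read throughput");
    }
}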