Performance issue when copying files from a different computer using CopyFile()
Using VC++ (Visual Studio 2003).
I'm trying to copy several image files (30 KB or so each) from another computer's shared folder to a local folder.
The problem is that a single transfer can involve more than 2000 or so files, and that seems
to take its toll, with the copy taking substantially longer to complete.
Is there any alternate method of copying files from another computer that could possibly
speed up the copy?
Thanks in advance.
EDIT*
Due to client request, it is not possible to change the code base dramatically.
I hate to deviate from best practice because of non-technical issues,
but is there a more subtle approach? Such as another function call?
I know I'm asking for some magical voodoo; asking just in case somebody knows of such.
A few things to try:
is copying files using the OS any faster?
if not, then there may be some inherent limitation in your network or the way it's set up (maybe authentication troubles, or the remote server has hardware issues, or is too busy, or the network card loses too many packets because of collisions, a faulty switch, bad wiring...)
make some tests transferring files of various sizes.
Small files are always slower to transfer because there is a lot of overhead to fetch their details, then transfer the data, then create directory entries etc.
if large files are fast, then your network is OK and you probably won't be able to improve the system much (the bottleneck is elsewhere).
Eventually, from code, you could try to open, read the files into a large buffer in one go then save them on the local drive. This may be faster as you'll be bypassing a lot of checks that the OS does internally.
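The read-into-one-buffer idea above can be sketched as follows. This is a minimal, portable C++ sketch of my own (the question targets Win32 `CopyFile()`, but the technique is the same: slurp the whole file in one read, then write it out in one call); the function name `buffered_copy` is mine, not part of any API.

```cpp
#include <fstream>
#include <iterator>
#include <stdexcept>
#include <string>
#include <vector>

// Read the whole source file into one buffer, then write it out in a
// single call, instead of streaming it in small chunks. For 30 KB
// images the whole file fits comfortably in memory.
void buffered_copy(const std::string& src, const std::string& dst)
{
    std::ifstream in(src.c_str(), std::ios::binary);
    if (!in)
        throw std::runtime_error("cannot open source: " + src);

    // Slurp the entire file in one go.
    std::vector<char> buf((std::istreambuf_iterator<char>(in)),
                          std::istreambuf_iterator<char>());

    std::ofstream out(dst.c_str(), std::ios::binary);
    if (!out)
        throw std::runtime_error("cannot open destination: " + dst);
    if (!buf.empty())
        out.write(&buf[0], static_cast<std::streamsize>(buf.size()));
}
```

Whether this actually beats `CopyFile()` depends on where the per-file overhead really is (metadata lookups on the share often dominate for small files), so measure both.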
You could even do this over a few threads to open, load, write files concurrently to speed things up a bit.
A couple of references you can check for multi-threaded file copy:
If implementing this yourself in code is too much trouble, you could always simply execute a utility like McTool in the background of your application and let it do the work for you.
嗯,首先,2000 年不是几个。 如果由于您发送大量小文件而花费了大部分时间,那么您会想出一个解决方案,将它们在源处打包成单个文件并在目标处解包。 这将需要在源代码中运行一些代码 - 您必须设计您的解决方案以允许这样做,因为我假设目前您只是从网络共享进行复制。
如果是网络速度(不太可能),您也可以压缩它们。
我自己的信念是,这将是文件的数量,基本上是副本的所有重复启动成本。 这是因为 2000 个 30K 文件只有 60MB,而在 10Mb 链路上,理论上最短时间约为一分钟。
如果您的时间远高于此,那么我会说我是对的。
使用 7zip 或类似的解决方案将它们全部压缩到一个
7z
文件中,传输它们,然后在另一端解压缩它们,听起来就像您正在寻找的那样。但要测量,不要猜测! 测试一下它是否可以提高性能。 然后做出决定。
Well, for a start, 2000 is not "several". If the copy takes most of its time because you're sending lots of small files, then come up with a solution that packages them at the source into a single file and unpacks them at the destination. This will require some code running at the source; you'll have to design your solution to allow that, since I assume at the moment you're just copying from a network share.
If it's the network speed (unlikely), compressing them helps as well.
My own belief is that it will be the number of files, basically all the repeated startup costs of a copy. That's because 2000 30 KB files is only 60 MB, and on a 10 Mb link the theoretical minimum time would be about a minute.
If your times are substantially above that, then I'd say I'm right.
A solution that uses 7zip or similar to compress them all to a single
7z
file, transmit it, then unzip it at the other end sounds like what you're looking for. But measure, don't guess! Test it to see whether it improves performance. Then make a decision.
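As a sketch of the pack/unpack step, the two 7-Zip command lines could be built like this. The `a` (add), `x` (extract with full paths), and `-o<dir>` (output directory, no space after `-o`) switches are standard 7-Zip CLI syntax; the helper names and any paths you pass in are illustrative only, and you'd run the resulting strings with `std::system()` or `CreateProcess()`, with the pack command executed on the source machine.

```cpp
#include <string>

// Build a "7z a <archive> <files>" command line to pack the small
// files into one archive at the source.
std::string make_pack_cmd(const std::string& archive,
                          const std::string& src_glob)
{
    return "7z a \"" + archive + "\" \"" + src_glob + "\"";
}

// Build a "7z x <archive> -o<dir>" command line to unpack the
// archive at the destination, preserving paths.
std::string make_unpack_cmd(const std::string& archive,
                            const std::string& dest_dir)
{
    return "7z x \"" + archive + "\" -o\"" + dest_dir + "\"";
}
```

For 30 KB JPEGs, consider storing without compression (`-mx0`); the win here is one transfer instead of 2000, not the compression ratio.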