"Copy failed: File too large" error in Perl

Posted 2024-09-09 16:52:03

Ok so i have 6.5 Million images in a folder and I need to get them moved asap. I will be moving them into their own folder structure but first I must get them moved off this server.

I tried rsync and cp and all sorts of other tools but they always end up erroring out. So i wrote a perl script to pull the information in a more direct method. Using opendir and having it count all the files works perfect. It can count them all in about 10 seconds. Now I try to just step my script up one more notch and have it actually move the files and I get the error "File too large". This must be some sort of false error as the files themselves are all fairly small.

#!/usr/bin/perl
#############################################
# CopyFilesLite
# Russell Perkins
# 7/12/2010
#
# Tool is used to copy millions of files
# while using as little memory as possible. 
#############################################

use strict;
use warnings;
use File::Copy;

#dir1, dir2 passed from command line
my $dir1 = shift;
my $dir2 = shift;
#Variables to keep count of things
my $count = 0;
my $cnt_FileExists = 0;
my $cnt_FileCopied = 0;

#simple error checking and validation
die "Usage: $0 directory1 directory2\n" unless defined $dir2;
die "Not a directory: $dir1\n" unless -d $dir1;
die "Not a directory: $dir2\n" unless -d $dir2;

opendir DIR, $dir1 or die "Could not open $dir1: $!\n";
while (my $file = readdir DIR){
  if (-e $dir2 . '/' . $file){
   #print $file . " exists in " . $dir2 . "\n"; #debugging
   $cnt_FileExists++;
  }else{
   copy($dir1 . '/' . $file, $dir2 . '/' . $file) or die "Copy failed: $!";
   $cnt_FileCopied++;
   #print $file . " does not exist in " . $dir2 . "\n"; #debugging
  }
  $count++;
}
closedir DIR;

#ToDo: Clean up output.
print "Total files: $count\nFiles not copied: $cnt_FileExists\nFiles Copied: $cnt_FileCopied\n\n";

So, have any of you run into this before? What would cause this, and how can it be fixed?

Comments (5)

意中人 2024-09-16 16:52:04

Maybe the file system of the partition you are sending the data to does not support very large files.
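One quick way to test this theory is to check the destination's filesystem type and the shell's per-process file-size limit, both of which can produce "File too large" (EFBIG). A minimal sketch, assuming GNU/Linux tools and using `.` as a stand-in for the real destination mount point:

```shell
dest=.         # stand-in: substitute the real destination mount point
df -T "$dest"  # shows the filesystem type; vfat, for example, caps files at 4 GiB
ulimit -f      # per-process file-size limit; "unlimited" is the usual value
```

If `df -T` reports something like vfat, or `ulimit -f` reports a finite number of blocks, that would explain the error even for modest files.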

白日梦 2024-09-16 16:52:03

In your error-handling code, could you please change

or die "Copy failed: $!";

to

or die "Copy failed: '$dir1/$file' to '$dir2/$file': $!";

Then it should tell you where the error happens.

Then check two things:

1) Does it fail every time on the same file?

2) Is that file somehow special? Weird name? Unusual size? Not a regular file? Not a file at all (as the other answer theorized)?
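Once the improved die message names the failing file, the usual shell tools answer question 2. A sketch with a hypothetical stand-in path (substitute the reported filename and drop the touch line when inspecting the real file):

```shell
f=./suspect-file  # hypothetical: substitute the filename the die message reports
touch "$f"        # stand-in so the commands below have something to inspect
ls -ld "$f"       # type flag, size, permissions, and any odd characters in the name
stat "$f"         # full metadata; "regular file" here rules out the special-file theory
```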

苍白女子 2024-09-16 16:52:03

I am not sure if this is related to your problem, but readdir will return a list of all directory contents, including subdirectories (if present) and, on many operating systems, the current (.) and parent (..) directories. You may be attempting to copy directories as well as files.
The following will not attempt to copy any directories:

while (my $file = readdir DIR){
    next if -d "$dir1/$file";
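To see in advance how many such non-file entries the loop would hit, the same filter can be run from the shell. A sketch with a hypothetical source directory (the mkdir only creates stand-in data; point src at the real image folder instead):

```shell
src=./images            # hypothetical source directory; substitute your own
mkdir -p "$src/thumbs"  # stand-in subdirectory so the filter has something to match
# Entries that are not regular files: the directory itself, any subdirectories,
# symlinks, sockets -- exactly the entries copy() would choke on.
find "$src" -maxdepth 1 ! -type f
```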

生生漫 2024-09-16 16:52:03

Seems this was an issue with the NFS mount on the server it was mounted to. I hooked up a USB drive to it and the files are copying with extreme speed... if you count USB 2 as extreme.

时光清浅 2024-09-16 16:52:03

6.5 million images in one folder is very extreme and puts a load on the machine just to read a directory, whether it's in shell or Perl. That's one big folder structure.

I know you're chasing a solution in Perl now, but when dealing with that many files from the shell you'll want to take advantage of the xargs command. It can help a lot by grouping the files into manageable chunks. http://en.wikipedia.org/wiki/Xargs
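A sketch of that approach, assuming GNU find/xargs/cp and hypothetical src_dir/dst_dir paths (the touch lines only create stand-in data; in reality the images already exist):

```shell
src=./src_dir
dst=./dst_dir
mkdir -p "$src" "$dst"
touch "$src/a.jpg" "$src/b.jpg"  # stand-ins for the real images
# -print0/-0 survive odd filenames; -n 500 keeps each cp invocation's argument
# list small; -t names the target directory once so the file list can go last.
find "$src" -maxdepth 1 -type f -print0 | xargs -0 -n 500 cp -t "$dst"
```

Because find streams names as it reads the directory, this sidesteps the argument-list and memory blowups that a bare `cp src/* dst/` hits with millions of entries.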
