Moving files around in Hadoop using the Java API?

Posted 2024-10-28 17:29:22


I want to move files around in HDFS using the Java APIs. I cannot figure out a way to do this. The FileSystem class only seems to want to allow moving to and from the local file system, but I want to keep them in HDFS and move them there.

Am I missing something basic? The only way I can figure to do it is to read it from the input stream and write it back out... and then delete the old copy (yuck).

thanks


Comments (4)

最初的梦 2024-11-04 17:29:23


Use FileSystem.rename():

public abstract boolean rename(Path src, Path dst) throws IOException

Renames Path src to Path dst. Can take place on local fs or remote DFS.

Parameters:
src - path to be renamed
dst - new path after rename
Returns:
true if rename is successful
Throws:
IOException - on failure
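
A minimal Java sketch of what that looks like in practice (class name and paths are placeholders, not part of the original answer; both paths must be on the same FileSystem):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsMove {
    public static void main(String[] args) throws Exception {
        // Picks up fs.defaultFS from core-site.xml/hdfs-site.xml on the classpath
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        // Placeholder paths -- both live on the same HDFS instance
        Path src = new Path("/user/me/input/data.txt");
        Path dst = new Path("/user/me/archive/data.txt");

        // rename() moves the file without rewriting its data; on HDFS it
        // returns false if, for example, the destination's parent directory does not exist
        boolean moved = fs.rename(src, dst);
        if (!moved) {
            System.err.println("Rename failed: " + src + " -> " + dst);
        }
        fs.close();
    }
}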

后eg是否自 2024-11-04 17:29:23

The java.nio.* approach may not always work on HDFS, so I found the following solution that works.

Move files from one directory to another using the org.apache.hadoop.fs.FileUtil.copy API:

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, FileUtil, Path}

val conf = new Configuration()
val srcFs = FileSystem.get(conf)
val dstFs = FileSystem.get(conf)
val dstPath = new Path(DEST_FILE_DIR)

for (file <- fileList) {
  // The 5th parameter (deleteSource = true) removes the source after copying,
  // so the copy behaves like a move
  FileUtil.copy(srcFs, file, dstFs, dstPath, true, conf)
}

讽刺将军 2024-11-04 17:29:23


I think FileUtil's replaceFile would also serve the purpose.
http://hadoop.apache.org/common/docs/current/api/org/apache/hadoop/fs/FileUtil.html#replaceFile(java.io.File, java.io.File)
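
Note from the linked signature that replaceFile takes java.io.File arguments, so it moves files on the local filesystem rather than within HDFS. A minimal sketch, with hypothetical local paths:

import java.io.File;
import java.io.IOException;
import org.apache.hadoop.fs.FileUtil;

public class LocalReplace {
    public static void main(String[] args) throws IOException {
        // Hypothetical local paths; replaceFile operates on java.io.File,
        // so this is a local-filesystem move, not an HDFS one
        File src = new File("/tmp/staging/data.txt");
        File target = new File("/tmp/final/data.txt");

        // Moves src to target, replacing target if it already exists
        FileUtil.replaceFile(src, target);
    }
}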

铜锣湾横着走 2024-11-04 17:29:23
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileStatus, FileSystem, FileUtil, Path}

val hdfsDirectory = "hdfs://srcPath"
val conf = new Configuration()

val srcPath: Path = new Path(hdfsDirectory)
val srcFs = FileSystem.get(srcPath.toUri, conf)
val dstPath: Path = new Path("hdfs://targetPath/")
val dstFs = FileSystem.get(dstPath.toUri, conf)

val status: Array[FileStatus] = srcFs.listStatus(srcPath)
if (status.length > 0) {
  status.foreach { x =>
    println("My files: " + x.getPath)
    // deleteSource = true, so each file is removed from the source after copying
    FileUtil.copy(srcFs, x.getPath, dstFs, dstPath, true, conf)
    println("Files moved !! " + x.getPath)
  }
} else {
  println("No Files Found !!")
}