Writing to a remote file with Fabric

Published 2024-11-13 05:27:21

I am trying to back up databases and move them around to different servers using Fabric.

When run against a remote server, opening a file for writing fails with this error:

newFile = open('%s%s' % (dumpPath,newFileName) ,'w')
IOError: [Errno 2] No such file or directory: '/home/ec2-user/dbbackup.sql.bz2'

That file exists, and I even tried creating it beforehand in case Fabric didn't have permission to create it, but it still didn't work:

 run("touch dbbackup.sql.bz2")

EDIT: I know that I can upload files to a remote server, but that's not what I'm trying to do with the open call. I am trying to compress a large file (a database dump). Is it possible to do this on the remote server, or would I have to copy the DB dump to the local host, compress it there, and then upload it back? Here is the compression on the local host:

import bz2

compObj = bz2.BZ2Compressor()
newFile = open('%s%s' % (dumpPath, newFileName), 'wb')  # compressed data must be written in binary mode
dbFile = open('%s%s' % (dumpPath, filename), 'rb')      # open() rather than the Python 2-only file()
block = dbFile.read(BLOCK_SIZE)
while block:  # compress and write the dump block by block
    newFile.write(compObj.compress(block))
    block = dbFile.read(BLOCK_SIZE)
newFile.write(compObj.flush())  # flush() returns the last buffered compressed data
dbFile.close()
newFile.close()

Comments (3)

贪了杯 2024-11-20 05:27:21

In Fabric, you are never "on a remote server". Some Fabric commands run locally, and some run on the remote server. In this case, you are using Python's open function, which tries to open the file on your local computer, and understandably fails. You can use Fabric's put and get functions to move files between your local computer and the remote server.
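
A minimal sketch of that distinction, assuming the Fabric 1.x API the question appears to use; the paths are placeholders taken from the question:

from fabric.api import run, put, get

# run() executes its command on the remote host...
run('touch /home/ec2-user/dbbackup.sql.bz2')

# ...whereas a plain open() in the fabfile always runs on the local machine.
# put() and get() are what actually move files between the two sides.
put('dbbackup.sql.bz2', '/home/ec2-user/dbbackup.sql.bz2')   # local -> remote
get('/home/ec2-user/dbbackup.sql.bz2', local_path='dbbackup.sql.bz2')  # remote -> local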

忆离笙 2024-11-20 05:27:21

I don't know if you can open a file remotely. But even if you can, it may not be a good idea in your case, since you will be fetching the large file over ssh (remember that Fabric is still running on your local machine). Why not compress the file remotely, and then get the compressed file? In case of mysqldump, it would look like this:

run('mysqldump [options] | gzip > outputfile.sql.gz')
get('outputfile.sql.gz')

(more on mysqldump and gzip here: Compressing mysqldump output )
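
A minimal sketch of how this might look as a Fabric 1.x task; the host, credentials, and database name are placeholders:

from fabric.api import env, run, get

env.hosts = ['ec2-user@example.com']  # placeholder host

def backup_db():
    # compress on the remote side so only the small archive crosses the network
    run('mysqldump --user=dbuser --password=dbpass mydatabase | gzip > /home/ec2-user/outputfile.sql.gz')
    # then pull the compressed dump down to the local machine
    get('/home/ec2-user/outputfile.sql.gz', local_path='outputfile.sql.gz')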

心病无药医 2024-11-20 05:27:21
  1. You need to read the Fabric tutorial again.
  2. You should be using os.path.join to assemble your filepath (see the sketch after this list).
  3. That open() call is trying to open the file on your local machine, NOT the remote server.
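
A minimal sketch of point 2, reusing the variable names from the question (dumpPath, filename, and newFileName are assumed to be defined as in the original script):

import os.path

# '%s%s' % (dumpPath, filename) silently builds a wrong path when dumpPath
# lacks a trailing slash; os.path.join inserts the separator only when needed.
dumpFilePath = os.path.join(dumpPath, filename)
newFilePath = os.path.join(dumpPath, newFileName)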