Transferring a file from FTPS to SFTP with Python
I am doing some performance testing of transferring large files (~4 GB) from an FTPS server to an SFTP server. I did some research and tried a Python script to see whether there is any performance improvement in fetching a file from FTPS and transferring it to SFTP.
FTPS connection setup
import ftplib
import socket

def create_connection(self):
    print('Creating session..........')
    ftp = ftplib.FTP_TLS()
    # ftp.set_debuglevel(2)
    ftp.connect(self.host, self.port)
    ftp.login(self.user, self.passwd)
    ftp.prot_p()
    # tune socket keepalive for the long-running download
    print('Optimizing socket..........')
    ftp.sock.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)
    ftp.sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPINTVL, 75)
    ftp.sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPIDLE, 60)
    print('Session created successfully')
    return ftp
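As an aside, `TCP_KEEPIDLE` and `TCP_KEEPINTVL` are not defined on every platform (they exist on Linux but not, for example, on stock macOS builds of Python). A small sketch of a portable variant, where the helper name `tune_keepalive` is mine and the defaults mirror the values above:

```python
import socket

def tune_keepalive(sock, idle=60, interval=75):
    # Enable keepalive everywhere; the idle/interval knobs are applied
    # only on platforms that define the TCP_KEEP* constants (e.g. Linux).
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)
    if hasattr(socket, 'TCP_KEEPIDLE'):
        sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPIDLE, idle)
    if hasattr(socket, 'TCP_KEEPINTVL'):
        sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPINTVL, interval)
```

With this guard the same connection code runs unchanged across operating systems instead of raising `AttributeError` where the constants are missing.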
from datetime import datetime
from io import BytesIO

def get_file(self, ftp_session, dst_filename, local_filename):
    print('Starting download........', datetime.now())
    myfile = BytesIO()
    print(myfile.tell())
    ftp_session.retrbinary('RETR %s' % dst_filename, myfile.write)
    print(myfile.tell())
    print('Download completed ........', datetime.now())
    return myfile  # the buffer is reused for the SFTP upload below
For the SFTP connection I am using paramiko:
import paramiko

host, port = "abc.com", 22
transport = paramiko.Transport((host, port))
username, password = "user", "pwd"
transport.connect(None, username, password)
transport.default_window_size = 3 * 1024 * 1024
sftp = paramiko.SFTPClient.from_transport(transport)
myfile.seek(0)
sftp.putfo(fl=myfile, remotepath='remotepath/' + local_filename)
sftp.close()
transport.close()  # also shut down the underlying SSH transport
I am using BytesIO so that I can keep the file in memory and stream it while copying. The code above does copy the file, but it takes ~20 minutes: it first downloads the whole file into memory and only then starts the transfer. Is there any way to transfer the file more efficiently?
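One way to avoid buffering the whole 4 GB first could be to stream the download straight into the upload: the writable file object returned by paramiko's `SFTPClient.open` can itself serve as the `retrbinary` callback. A minimal sketch, where `stream_ftps_to_sftp` is a hypothetical helper and `ftp_session`/`sftp` are the connections created above:

```python
def stream_ftps_to_sftp(ftp_session, sftp, src_name, dst_path, blocksize=256 * 1024):
    # Open the remote SFTP file and feed each FTPS download chunk
    # straight into it, so the file never has to fit in memory.
    with sftp.open(dst_path, 'wb') as remote_file:
        remote_file.set_pipelined(True)  # let paramiko overlap SFTP writes
        ftp_session.retrbinary('RETR %s' % src_name, remote_file.write,
                               blocksize=blocksize)
```

Whether this beats the buffered version depends on the two servers' bandwidth, but it removes the serial "download fully, then upload" phase and keeps memory use bounded by `blocksize`.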
1 Comment
After some short Google searches (yes, Google really does work) I stumbled across this thread:
Paramiko Fails to download large files >1GB
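For reference, the advice in that thread amounts to widening paramiko's SSH flow-control window and pushing the rekeying thresholds out of the way before a multi-GB transfer. A hedged sketch: the helper name and the `2**40` thresholds are illustrative, while `default_window_size` and the `packetizer` attributes come from paramiko's `Transport`/`Packetizer` API:

```python
def tune_transport_for_large_files(transport, window_size=3 * 1024 * 1024):
    # Widen the SSH flow-control window and delay rekeying so it does
    # not interrupt a multi-GB transfer; call before from_transport().
    transport.default_window_size = window_size
    transport.packetizer.REKEY_BYTES = pow(2, 40)    # rekey only after ~1 TB
    transport.packetizer.REKEY_PACKETS = pow(2, 40)
    return transport
```

Applied to the paramiko setup in the question, this would replace the bare `transport.default_window_size = 3 * 1024 * 1024` line before `SFTPClient.from_transport(transport)` is called.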