FTP connection "hangs" on LIST

Published 2024-09-18 00:05:53


All,

I have an issue with a remote ftp server that has kept me busy for three days now and I am going nuts over it. :(

A while ago, I wrote a simple FTP retriever class that uses Apache commons-net 2.0. The class works fine against 5 different FTP servers; I can retrieve data as I want.
Now I have come across a server that I need to connect to that just won't let me list directories or retrieve data.

This is the order of commands that are being sent and retrieved by my class:

220 (vsFTPd 2.0.1)
USER XXXXXXX
331 Please specify the password.
PASS XXXXXXX
230 Login successful
TYPE I
200 Switching to Binary mode.
PASV
227 Entering Passive Mode (XXX,XXX,XXX,XXX,XXX,XXX)
NLST
150 Here comes the directory listing.
226 Directory send OK.
SYST
215 UNIX Type: L8
PASV
227 Entering Passive Mode (XXX,XXX,XXX,XXX,XXX,XXX)
LIST
150 Here comes the directory listing.
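For context on the two 227 replies in the trace above: they encode where the client must open the data connection. A small sketch of how a client derives the host and port from a 227 reply (the class and method names here are my own for illustration, not part of commons-net; the port is p1*256 + p2 per the FTP spec):

```java
// Hypothetical helper: turn "227 Entering Passive Mode (h1,h2,h3,h4,p1,p2)"
// into the host and port of the data connection.
public class PasvReply {

    // Extract the six comma-separated fields between the parentheses.
    private static String[] fields(String reply) {
        int open = reply.indexOf('(');
        int close = reply.indexOf(')', open);
        if (open < 0 || close < 0) {
            throw new IllegalArgumentException("not a 227 reply: " + reply);
        }
        return reply.substring(open + 1, close).split(",");
    }

    public static String dataHost(String reply) {
        String[] f = fields(reply);
        return f[0] + "." + f[1] + "." + f[2] + "." + f[3];
    }

    public static int dataPort(String reply) {
        String[] f = fields(reply);
        // High byte * 256 + low byte, as defined for the PASV reply.
        return Integer.parseInt(f[4].trim()) * 256 + Integer.parseInt(f[5].trim());
    }

    public static void main(String[] args) {
        String reply = "227 Entering Passive Mode (192,168,0,10,19,137)";
        System.out.println(dataHost(reply) + ":" + dataPort(reply)); // 192.168.0.10:5001
    }
}
```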

At the last line, my code hangs indefinitely (well, I killed it after 2 hours of waiting to see how long it would block). I have tried everything, from using an active connection to setting ASCII type to using different ftp libraries - always with the same result.
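One thing that would at least bound the damage, whatever the root cause: with no read timeout set on the data socket, a peer that never sends anything blocks the client forever. A plain-JDK sketch (no FTP involved, just stdlib sockets; the class and method names are mine) showing how SO_TIMEOUT turns an indefinite block into an exception — with commons-net the corresponding knobs are, as far as I know, setSoTimeout and setDataTimeout on FTPClient:

```java
import java.io.InputStream;
import java.net.ServerSocket;
import java.net.Socket;
import java.net.SocketTimeoutException;

public class ReadTimeoutDemo {

    // Returns true if the read gave up with SocketTimeoutException
    // instead of blocking forever on a peer that never writes.
    public static boolean readTimesOut(int timeoutMillis) throws Exception {
        try (ServerSocket server = new ServerSocket(0);                     // silent peer
             Socket client = new Socket("127.0.0.1", server.getLocalPort());
             Socket peer = server.accept()) {                               // handshake done, no data sent
            client.setSoTimeout(timeoutMillis);                             // 0 would mean "block forever"
            InputStream in = client.getInputStream();
            try {
                in.read();                                                  // nothing will ever arrive
                return false;
            } catch (SocketTimeoutException expected) {
                return true;                                                // bounded wait instead of a hang
            }
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(readTimesOut(500)); // prints "true" after ~0.5 s
    }
}
```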

Normally, I would just call the guys and tell them that their server is configured incorrectly. However, connecting via FileZilla not only works but is lightning fast and never causes any problems. Also, connecting via the command line on Linux works like a charm.

I am totally lost here. Does anybody have any ideas why I have this problem?

Cheers


凹づ凸ル 2024-09-25 00:05:53


I cannot believe that I spent almost five days on this. After long sessions of rolling back changes, committing intermediate versions, debugging and about 15923 cups of coffee, I finally found the reason for all this mess.

It turns out that - for whatever reason - as soon as you package the xpp3 drivers (as used by XStream) in your EAR and deploy it on JBoss 5.1, any connection via any FTP library gets messed up.

I have no idea if this is caused by other libraries interfering with xpp3 or by xpp3 itself. Frankly, at the moment I couldn't care less. All I know is that as soon as I removed that dependency from my project, everything worked like a charm.
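In case anyone hits the same thing and manages dependencies with Maven, excluding the transitive xpp3 artifact from XStream might look roughly like this (the coordinates and version are illustrative, not taken from my build - check your own tree with mvn dependency:tree):

```xml
<dependency>
  <groupId>com.thoughtworks.xstream</groupId>
  <artifactId>xstream</artifactId>
  <version>1.3.1</version>
  <exclusions>
    <!-- keep xpp3 out of the EAR; XStream can fall back to another parser -->
    <exclusion>
      <groupId>xpp3</groupId>
      <artifactId>xpp3_min</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```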

Damn you, xpp3 - I will sue you for the ten years of my life you cost me! :)

Thanks all for your help, I am going home now...

悲歌长辞 2024-09-25 00:05:53


Suggestion: install Wireshark on the client machine and capture network traces under both working (FileZilla) and non-working conditions to see what's different. If you're on Linux, use the tcpdump command to capture the packets and then use Wireshark to examine them.
