Serving remote PHP files for a PHP website
This is the situation:
I have a LAMP server, which serves HTML, PHP, etc. Now I have a remote folder, somewhere on the web, which holds a directory full of PHP files, images, an MVC folder structure (CodeIgniter), and so on.
What I want to do is this: instead of downloading those PHP files and uploading them to my LAMP server every time I want to serve them, I want to use those PHP files directly and serve them from my LAMP server.
Again, I want the PHP files from a folder on another server, to which I only have a direct link for each individual file, to be served by my LAMP server. So if I access my website, for instance www.website.com/page1, it should pick up the folder structure (or all the PHP files) from the remote web server and serve it from my server.
I know this sounds a little complicated, but I'm not sure what to use... Maybe a reverse proxy? Do you think I should just download the files directly and keep them constantly synced? If anyone comes up with a good solution I may even pay that person...
EDIT(1)
Good answers so far... but I don't think I asked the question well, so here it goes again:
I have access to a "list" of PHP files, and in order to get them I need to authenticate myself using OAuth via PHP. Once I am authenticated, I can retrieve a list of PHP, HTML, etc. files, each of them having a public URL that anyone can access. So the thing is that instead of downloading all the files in that repository and serving them, I want to be able to reuse that repository's web space and just serve the files myself. So basically I want to be able to have symbolic links to URLs, which I think is not possible, but to be able to just read the files and serve the PHP logic, even though the files live elsewhere.
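For illustration only, here is a minimal sketch of that flow in PHP: authenticate with an OAuth access token, pull the file list, and mirror each file into a local directory that the web server actually serves. The endpoint URL, token, and JSON response shape are placeholders, not a real API.

```php
<?php
// Hypothetical sketch: instead of serving the remote files in place,
// authenticate with OAuth, read the file list, and mirror the files
// into the local web root. Endpoint, token and JSON shape are made up.

$accessToken = 'YOUR_OAUTH_ACCESS_TOKEN';               // obtained earlier via the OAuth flow
$listUrl     = 'https://remote.example.com/api/files';  // placeholder "list of files" endpoint
$localRoot   = '/var/www/html/mirror';                  // directory Apache serves locally

// Fetch the JSON list of files, sending the OAuth token.
$ch = curl_init($listUrl);
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_HTTPHEADER     => ['Authorization: Bearer ' . $accessToken],
]);
$files = json_decode(curl_exec($ch), true);
curl_close($ch);

// Assume each entry looks like:
//   { "path": "application/controllers/page1.php", "url": "https://.../public-link" }
foreach ($files as $file) {
    $target = $localRoot . '/' . $file['path'];
    if (!is_dir(dirname($target))) {
        mkdir(dirname($target), 0755, true);
    }
    // The public URL needs no auth, so a plain download is enough.
    file_put_contents($target, file_get_contents($file['url']));
}
```

Run from cron, this keeps a local copy of the repository rather than touching it on every request; it is essentially the "download and keep syncing" option, not a true remote include.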
I'm concerned about the security issues involved, but if someone could help me I would be thankful... Also, if you are interested in what I'm doing, I can always use a partner for this project, which I intend to use for charity, but I can still pay that person.
Comments (3)
This is not a smart thing to do. You open yourself up to potential security issues, and at a minimum you will significantly slow your site down.
I would recommend that you simply synchronize the files between the two servers over SSH with a script.
Edit: ManseUK's suggestion of rsync is also a good one.
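As a rough sketch of that approach, assuming key-based SSH access to the remote box is already set up, a small PHP script run from cron could shell out to rsync. The user, host and paths below are placeholders.

```php
<?php
// Minimal cron-driven sync sketch: pull the remote tree with rsync over SSH.
// The user, host and paths are placeholders; SSH key auth is assumed.

$remote = 'deploy@remote.example.com:/var/www/app/';  // hypothetical source
$local  = '/var/www/html/app/';                       // local web root target

// -a preserves permissions/timestamps, -z compresses the transfer,
// --delete removes local files that no longer exist on the remote side.
$cmd = sprintf(
    'rsync -az --delete -e ssh %s %s 2>&1',
    escapeshellarg($remote),
    escapeshellarg($local)
);

exec($cmd, $output, $status);
if ($status !== 0) {
    error_log("rsync failed:\n" . implode("\n", $output));
}
```

A crontab entry along the lines of `*/15 * * * * php /usr/local/bin/sync.php` (path made up) would keep the local copy reasonably fresh.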
If you have FTP access to the remote server, you could mount the folder using FUSE and serve it as usual with Apache.
Do you have the ability to mount the remote folder as an NFS volume, or perhaps with SSHFS? If those options are available, either could work for you. You'd mount the remote folder locally and tell your local web server to serve files from that path. Not that it would be the most efficient setup in the world, but I don't know why you have all this split apart in the first place. ;)
You could write a cron job to grab the remote file list every X minutes/hours/days and store the results locally, then write a simple script to parse those results upon request, as sketched below. Alternatively, you could still use an NFS or SSHFS mount to read the remote paths in real time and build whatever URLs you need.
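For the cron-plus-lookup variant, a very rough sketch might look like the following. The list format, cache paths and request parameter are all invented, and it assumes you trust the remote repository enough to execute whatever code it hands you.

```php
<?php
// Sketch of the "cached file list + request-time script" idea. A cron job
// is assumed to have written files.json (a map of local paths to public
// URLs) into the cache directory. Names and formats here are made up.

$cacheDir = '/var/www/cache';
$list     = json_decode(file_get_contents($cacheDir . '/files.json'), true);

// e.g. a request for /page1 is looked up as "page1.php" in the list.
$page = basename($_GET['page'] ?? 'index') . '.php';

if (!isset($list[$page])) {
    http_response_code(404);
    exit('Not found');
}

$localCopy = $cacheDir . '/' . $page;

// Refresh the local copy if it is missing or older than an hour.
if (!file_exists($localCopy) || filemtime($localCopy) < time() - 3600) {
    file_put_contents($localCopy, file_get_contents($list[$page]));
}

// Execute the PHP locally. Only do this if you trust the remote source:
// you are running whatever code the remote repository serves you.
require $localCopy;
```

The mount-based alternative (NFS or SSHFS) avoids this script entirely: once the remote folder is mounted under the document root, Apache and PHP treat it as ordinary local files.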