Input on a decision: using Amazon S3 (or similar file hosting) with PHP



I appreciate your comments to help me decide on the following.

My requirements:

  • I have a site hosted on a shared server, and I'm going to provide content to my users: about 60 GB of content (roughly 2000 files of 30 MB each; users will have access to only 20 files at a time). I calculate about 100 GB of monthly bandwidth usage.

  • Once a user registers for the content, links will be accessible for the user to download. But I want the links to expire in 7 days, with the possibility of extending the expiration time.

  • I think the disk space and bandwidth call for a service like Amazon S3 or Rackspace Cloud Files (or is there an alternative?).

  • To manage expiration, I plan to either obtain links that expire (I think S3 has that feature, not Rackspace) OR control the expiration date in my database and have a batch process that renames, on a daily basis, all 200 files on the cloud and in my database (so that if a user copied the direct link, it won't work the next day; only my webpage will have the updated links). PHP is used for programming. (A rough sketch of this batch-rename idea follows this list.)
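
For illustration only, here is a minimal sketch of the daily batch-rename idea from the last bullet, assuming the AWS SDK for PHP (v3); the bucket name and the `downloads` table with its columns are hypothetical placeholders, not anything from the original setup:

```php
<?php
// Sketch: nightly cron job that "renames" every tracked S3 object by copying it
// to a new random key and deleting the old one, then records the new key so the
// site's pages can link to it. Old copied links stop working after the run.
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$bucket = 'my-content-bucket';                 // placeholder bucket name
$pdo    = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$s3     = new S3Client(['version' => 'latest', 'region' => 'us-east-1']);

foreach ($pdo->query('SELECT id, object_key FROM downloads') as $row) {
    $newKey = bin2hex(random_bytes(16)) . '.zip';      // long random filename

    // Server-side copy to the new key, then remove the old object.
    $s3->copyObject([
        'Bucket'     => $bucket,
        'Key'        => $newKey,
        'CopySource' => $bucket . '/' . rawurlencode($row['object_key']),
    ]);
    $s3->deleteObject(['Bucket' => $bucket, 'Key' => $row['object_key']]);

    // Point the site's download links at the new key.
    $stmt = $pdo->prepare('UPDATE downloads SET object_key = ? WHERE id = ?');
    $stmt->execute([$newKey, $row['id']]);
}
```

(As the follow-up and the last answer below note, pre-signed URLs make this renaming unnecessary.)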

So what do you think? Is cloud file hosting the way to go? Which one? Does managing the links that way make sense, or is it too difficult to do through programming (sending commands to the cloud server...)?

EDIT:
Some hosting companies offer unlimited space and bandwidth on their shared plans. I asked their support staff, and they said that they really do honor the "unlimited" deal. So 100 GB of transfer a month is OK; the only thing to watch out for is CPU usage. So shared hosting is one more alternative to choose from.

FOLLOWUP:
Digging more into this, I found that the TOS of the unlimited plans says that it is not permitted to use the space primarily to host multimedia files. So I decided to go with Amazon S3 and the solution provided by Tom Andersen.

Thanks for the input.


Comments (3)

入画浅相思 2025-01-08 18:26:29


I personally don't think you necessarily need to go with a cloud-based solution for this. It may be a little costly. You could simply get a dedicated server instead. One provider that comes to mind gives 3,000 GB/month of bandwidth on some of their lowest-level plans. That is on a 10 Mbit uplink; you can upgrade to 100 Mbps for $10/mo or 1 Gbit for $20/mo. I won't mention any names, but you can search for dedicated servers and possibly find one to your liking.

As for expiring the files, just implement that in PHP backed by a database. You won't have to move files around: store all the files in a directory not accessible from the web, and use a PHP script to determine whether the link is valid; if so, read the contents of the file and pass them through to the browser. If the link is invalid, you can show an error message instead. It's a pretty simple concept, and I think there are a lot of pre-written scripts available that do that, but depending on your needs, it isn't too difficult to do yourself.
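
For example, a minimal sketch of that approach (my illustration, not the answerer's code), assuming a hypothetical `download_links` table with `token`, `filename`, and `expires_at` columns, and files stored under `/srv/content` outside the web root:

```php
<?php
// Sketch: validate a download token against the database, then stream the file
// from a directory the web server does not expose directly.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

$token = $_GET['token'] ?? '';
$stmt  = $pdo->prepare(
    'SELECT filename FROM download_links WHERE token = ? AND expires_at > NOW()'
);
$stmt->execute([$token]);
$row = $stmt->fetch(PDO::FETCH_ASSOC);

if (!$row) {
    http_response_code(403);
    exit('This download link is invalid or has expired.');
}

$path = '/srv/content/' . basename($row['filename']);   // outside the web root

// Pass the file contents through to the browser as an attachment.
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($path) . '"');
header('Content-Length: ' . filesize($path));
readfile($path);
```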

Cloud hosting has advantages, but right now I think it's costly, and unless you're trying to spread the load geographically or planning to support thousands of simultaneous users and need the elasticity of the cloud, you could use a dedicated server instead.

Hope that helps.

战皆罪 2025-01-08 18:26:29


I can't speak for S3 but I use Rackspace Cloud files and servers.

It's good in that you don't pay for incoming bandwidth, so uploads are super cheap.

I would do it like this:

  1. Upload all the files you need to a 'private' container
  2. Create a public container with CDN enabled
  3. That'll give you a special URL like http://c3214146.r65.ce3.rackcdn.com
  4. Make your own CNAME DNS record for your domain pointing to that, e.g. http://cdn.yourdomain.com
  5. When a user requests a file, use the COPY api operation with a long random filename to do a server-side copy from the private container to the public container.
  6. Store the filename in a MySQL DB for your app
  7. Once the file expires, use the DELETE api operation, then the PURGE api operation to get it out of the CDN, and finally delete the record from the MySQL table (a sketch of steps 5–7 follows this list).
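
A minimal sketch of steps 5–7 with raw cURL against the Cloud Files (OpenStack Swift) REST API; the storage URL, CDN management URL, auth token, and container/file names below are placeholders (in practice you would take them from the authentication response, or use an official SDK instead):

```php
<?php
// Sketch: server-side COPY into the CDN-enabled container under a random name,
// then later DELETE the object and purge it from the CDN edge servers.
function cloudFilesRequest(string $url, string $method, string $token, array $headers = []): int
{
    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_CUSTOMREQUEST  => $method,
        CURLOPT_HTTPHEADER     => array_merge(["X-Auth-Token: $token"], $headers),
        CURLOPT_RETURNTRANSFER => true,
    ]);
    curl_exec($ch);
    $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);    // 2xx means success
    curl_close($ch);
    return $status;
}

$storageUrl = 'https://storage101.dfw1.clouddrive.com/v1/MossoCloudFS_xxxx'; // placeholder
$cdnMgmtUrl = 'https://cdn1.clouddrive.com/v1/MossoCloudFS_xxxx';            // placeholder
$token      = 'YOUR_AUTH_TOKEN';                                             // placeholder

// Step 5: copy private/file.zip to the public container under a long random name.
$randomName = bin2hex(random_bytes(16)) . '.zip';
cloudFilesRequest("$storageUrl/private/file.zip", 'COPY', $token,
                  ["Destination: /public/$randomName"]);

// Step 6: store $randomName in your MySQL table (omitted here).

// Step 7, once the link expires: delete the object, then purge it from the CDN.
cloudFilesRequest("$storageUrl/public/$randomName", 'DELETE', $token);
cloudFilesRequest("$cdnMgmtUrl/public/$randomName", 'DELETE', $token);
```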

With the PURGE command, I heard it doesn't work 100% of the time and may leave the file around for an extra day; the docs also say to reserve its use for emergencies only.

Edit: I just heard there's a limit of 25 purges per day.

However, personally I've just used DELETE on objects and found that it took them out of the CDN straight away. In summary, the worst case would be that the file is still accessible on some CDN nodes for 24 hours after deletion.

Edit: You can change the TTL (caching time) on the CDN nodes. The default is 72 hours, so it might pay to set it to something lower, but not so low that you lose the advantage of the CDN.

The advantages I find with the CDN are:

  1. It pushes content right out to end users far away from the USA servers and gives super fast download times for them
  2. If you have a super popular file, it won't take down your site when 1000 people start trying to download it, as they'd all get copies pushed out from whatever CDN node they're closest to.
独﹏钓一江月 2025-01-08 18:26:29


You don't have to rename the files on S3 every day. Just make them private (which is the default), and hand out time-limited URLs, valid for a day or a week, to anyone who is authorized.

I would consider making the links only good for 20 mins, so that a user has to re-login in order to re-download the files. Then they can't even share the links they get from you.
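
For reference, a minimal sketch of handing out such a time-limited (pre-signed) URL with the AWS SDK for PHP (v3); the bucket and object key are placeholders:

```php
<?php
// Sketch: generate a pre-signed URL for a private S3 object. Only someone with
// this exact URL can download the file, and only until the URL expires.
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',      // credentials come from the environment
]);

$cmd = $s3->getCommand('GetObject', [
    'Bucket' => 'my-content-bucket',
    'Key'    => 'files/video-01.mp4',
]);

// '+20 minutes' keeps the link short-lived, as suggested above; Signature
// Version 4 pre-signed URLs can be valid for at most 7 days.
$request = $s3->createPresignedRequest($cmd, '+20 minutes');
$url     = (string) $request->getUri();

echo $url;   // hand this URL to the authorized, logged-in user
```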
