Self-hosted S3 alternative

I am looking for an S3 alternative that relies on a RESTful API, so that I can simply hand out links such as http://datastore1.example.com/ID and they are directly downloadable.
I have looked at RIAK and Bitcache. They both seem very nice: http://bitcache.org/api/rest. But they have one problem: I want to be the only one who can upload data. Otherwise anyone could use our datastore by sending a PUT request.

Is there a way to configure RIAK so that everyone can GET files, but only I can PUT or DELETE them? Are there other services you can recommend?

Also adding a bounty :)

Requirements:

  • RESTful API
  • Guests GET only
  • Runs on Debian

Very nice to have:

  • Automatically distributed

EDIT: To clarify, I don't want any connection to S3. I have great servers just lying around, with hard drives and a very good network connection (3 Gbps), so I don't need S3.


Comments (7)

怀中猫帐中妖 2024-09-19 15:02:16

The Riak authors recommend putting an HTTP proxy in front of Riak to provide access control. You can choose any proxy server you like (such as nginx or Apache), and any access control policy you like (such as authorization based on IP addresses, HTTP basic auth, or cookies, assuming your proxy server can handle it). For example, in nginx you might use limit_except (likewise LimitExcept in Apache).

Alternatively, you could also add access control to Riak directly. It's based on Webmachine, so one approach would be to implement is_authorized.
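
To make the proxy idea concrete, a minimal nginx sketch along these lines would leave GET (and HEAD) open to everyone while demanding HTTP basic auth for every other method before proxying to Riak. The upstream address, server name, and htpasswd path below are placeholders, not taken from the answer:

    # Sketch only: nginx in front of Riak; guests may GET, all other
    # methods (PUT, POST, DELETE, ...) require basic authentication.
    upstream riak_backend {
        server 127.0.0.1:8098;   # Riak's default HTTP port (assumed)
    }

    server {
        listen 80;
        server_name datastore1.example.com;

        location / {
            # limit_except applies the enclosed directives to every method
            # NOT listed here, i.e. everything except GET and HEAD.
            limit_except GET {
                auth_basic           "Datastore uploads";
                auth_basic_user_file /etc/nginx/htpasswd;   # placeholder path
            }
            proxy_pass http://riak_backend;
        }
    }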

在你怀里撒娇 2024-09-19 15:02:16

Based on the information that you have given, I would suggest Eucalyptus (http://open.eucalyptus.com/). They do have an S3-compatible storage system.

终弃我 2024-09-19 15:02:16

The reliable, distributed object store RADOS, which is part of the Ceph file system, provides an S3 gateway.

We used the Eucalyptus storage system, Walrus, but we had reliability problems.

终遇你 2024-09-19 15:02:16

If you are looking for a distributed file system, why don't you try Hadoop HDFS?

http://hadoop.apache.org/common/docs/r0.17.0/hdfs_design.html

There is a Java API available:

http://hadoop.apache.org/common/docs/r0.20.2/api/org/apache/hadoop/fs/FileSystem.html

Currently, security is an issue - at least if you have access to a terminal:

http://developer.yahoo.com/hadoop/tutorial/module2.html#perms

But you could deploy hdfs, put an application server (using the Java API) in front of it (GlassFish) and use Jersey to build the RESTful API:

http://jersey.java.net/
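
As a rough sketch of that setup (not from the original answer; the HDFS URI, base path, and class name are made up for illustration), a GET-only Jersey resource backed by the Hadoop FileSystem API might look roughly like this:

    // Sketch: a GET-only JAX-RS (Jersey) resource that streams files out of
    // HDFS via the Hadoop FileSystem API. No PUT or DELETE is exposed at all.
    import java.io.InputStream;
    import java.net.URI;

    import javax.ws.rs.GET;
    import javax.ws.rs.Path;
    import javax.ws.rs.PathParam;
    import javax.ws.rs.Produces;
    import javax.ws.rs.core.MediaType;
    import javax.ws.rs.core.Response;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;

    @Path("/objects")
    public class ObjectResource {

        @GET
        @Path("/{id}")
        @Produces(MediaType.APPLICATION_OCTET_STREAM)
        public Response get(@PathParam("id") String id) throws Exception {
            Configuration conf = new Configuration();
            // "hdfs://namenode:9000" and "/datastore" are assumed example values
            FileSystem fs = FileSystem.get(URI.create("hdfs://namenode:9000"), conf);
            org.apache.hadoop.fs.Path file =
                    new org.apache.hadoop.fs.Path("/datastore/" + id);

            if (!fs.exists(file)) {
                return Response.status(Response.Status.NOT_FOUND).build();
            }
            InputStream in = fs.open(file);   // FSDataInputStream
            return Response.ok(in).build();   // streamed back to the client
        }
    }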

If you're interested in building such a thing, please let me know, for I may be building something similar quite soon.

You can use the Cloudera Hadoop Distribution to make life a bit easier:

http://www.cloudera.com/hadoop/

Greetz,
J.

花落人断肠 2024-09-19 15:02:16

I guess you should ask your question on serverfault.com, as it's more system-related.
Anyway, I can suggest MogileFS, which scales very well: http://danga.com/mogilefs/

[浮城] 2024-09-19 15:02:16

WebDAV is about as RESTful as it gets, and there are many implementations that scale to various uses. In any case, if it is REST and it is HTTP, then whatever authentication scheme the server supports should allow you to control who can upload.
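
For example, one hedged way to get that with stock software is Apache httpd's mod_dav: leave the read methods open and require credentials for everything else. The paths and realm below are placeholders, and mod_dav, mod_dav_fs, and mod_auth_basic are assumed to be enabled:

    # Sketch: a WebDAV share where anonymous clients may only GET/HEAD;
    # PUT, DELETE, MKCOL, etc. require basic authentication.
    <VirtualHost *:80>
        ServerName datastore1.example.com
        # placeholder document root
        DocumentRoot /srv/datastore

        <Directory /srv/datastore>
            Dav On

            AuthType Basic
            AuthName "Datastore uploads"
            # placeholder credentials file
            AuthUserFile /etc/apache2/htpasswd

            <Limit GET HEAD OPTIONS>
                Require all granted
            </Limit>
            <LimitExcept GET HEAD OPTIONS>
                Require valid-user
            </LimitExcept>
        </Directory>
    </VirtualHost>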

清晨说晚安 2024-09-19 15:02:16

You can develop it yourself as a web app or as part of your existing application. It will consume HTTP requests, retrieve their URI component, convert it to an S3 object name, and use getObject() to fetch its content (using one of the available S3 SDKs, for example the AWS Java SDK).
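
A small sketch of that flow, assuming the javax.servlet API and the AWS SDK for Java (the bucket name and URL mapping are made up for illustration):

    // Sketch: map the request path "/ID" onto an S3 key, fetch it with
    // getObject() and stream the body back. Error handling is omitted.
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.OutputStream;

    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    import com.amazonaws.services.s3.AmazonS3;
    import com.amazonaws.services.s3.AmazonS3Client;
    import com.amazonaws.services.s3.model.S3Object;

    public class S3GatewayServlet extends HttpServlet {

        // Uses the default AWS credential chain; the bucket name is a placeholder
        private final AmazonS3 s3 = new AmazonS3Client();
        private static final String BUCKET = "my-datastore-bucket";

        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws IOException {
            String key = req.getPathInfo().substring(1);  // "/ID" -> "ID"
            S3Object object = s3.getObject(BUCKET, key);

            resp.setContentType(object.getObjectMetadata().getContentType());
            try (InputStream in = object.getObjectContent();
                 OutputStream out = resp.getOutputStream()) {
                byte[] buf = new byte[8192];
                int n;
                while ((n = in.read(buf)) != -1) {
                    out.write(buf, 0, n);
                }
            }
        }
    }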

You can try a hosted solution, s3auth.com (I'm a developer). It's an open-source project, and you can see how this mechanism is implemented internally in one of its core classes. An HTTP request is processed by the service and then re-translated into Amazon S3's internal authentication scheme.
